Robots probably won’t kill people, but people could kill people with robots.
Here’s what the letter said:
“If any major military power pushes ahead with AI weapon development, a global arms race is virtually inevitable, and the endpoint of this technological trajectory is obvious: autonomous weapons will become the Kalashnikovs of tomorrow.”
The Future of Life Institute, the volunteer-backed research organization that posted the letter, aims to “maximize the future benefits of AI while avoiding pitfalls,” according to its website. In January, it published an open letter, also signed by Musk, Hawking and Wozniak, on the need to make AI systems both “robust and beneficial.”