And other top scientists go the other way.
In Michio Kaku's book The Future of the Mind he talks about this. As long as they are designed in the correct way, it is not an issue. And I agree. We can put safeguards in place.
We are a long way from self-evolving AI, let alone one that might try to protect itself, let alone one with the capability to do so. Not to say it won't ever be a problem or something we shouldn't be wary of.
Isaac Asimov had the AI thing covered years ago.
1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey the orders given to it by human beings, except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
Presumably you haven't read many of his books, or even seen the film I, Robot, because, from what little I have read and watched, I get the impression that a lot of his stories are built around the flaws in these rules. They are definitely a good starting point, though.
I know what you mean, but if I were to be very pedantic I could say that we have had self-evolving AI in a very basic form for ages. For a uni assignment I made an evolutionary algorithm in which the evolutionary process altered the parameters of the algorithm itself, and that just about qualifies as self-evolving AI (it was evolving a control system for a simple virtual robot). A rough sketch of the idea is below.
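To illustrate, here is a minimal sketch of what that kind of self-adaptive evolutionary algorithm might look like. It is not the original assignment code: the point-robot task, the controller weights, and the function names are all made up for illustration. The key idea is that the mutation step size (sigma) is part of the genome, so evolution tunes one of its own parameters while it evolves the controller.

```python
import math
import random

def simulate(controller, steps=100):
    """Run a toy point robot with the given controller weights.
    Returns the final distance to the target (lower is better)."""
    x, y, heading = 0.0, 0.0, 0.0
    tx, ty = 10.0, 5.0  # target position (arbitrary)
    w_bearing, w_dist, w_speed = controller
    for _ in range(steps):
        dx, dy = tx - x, ty - y
        dist = math.hypot(dx, dy)
        bearing = math.atan2(dy, dx) - heading
        # Controller: turn rate is a linear function of the two "sensor" readings.
        turn = w_bearing * bearing + w_dist * dist
        speed = max(0.0, w_speed)
        heading += max(-0.3, min(0.3, turn))  # clamp the turn rate
        x += speed * math.cos(heading) * 0.1
        y += speed * math.sin(heading) * 0.1
    return math.hypot(tx - x, ty - y)

def evolve(generations=500, tau=0.3):
    """(1+1) evolution strategy with self-adaptive mutation step size."""
    weights = [random.uniform(-1, 1) for _ in range(3)]
    sigma = 0.5  # mutation step size, itself subject to evolution
    best = simulate(weights)
    for _ in range(generations):
        # Self-adaptation: mutate sigma first, then use it to mutate the weights.
        child_sigma = sigma * math.exp(tau * random.gauss(0, 1))
        child = [w + child_sigma * random.gauss(0, 1) for w in weights]
        fitness = simulate(child)
        if fitness <= best:  # keep the child if it is no worse than the parent
            weights, sigma, best = child, child_sigma, fitness
    return weights, sigma, best

if __name__ == "__main__":
    w, s, f = evolve()
    print(f"weights={w}, final sigma={s:.4f}, distance to target={f:.4f}")
```

It's only a toy, but it shows the "algorithm altering its own parameters" part: sigma mutates alongside the controller and is kept or discarded with it, so successful step sizes propagate.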
Isaac Asimov had the AI thing covered years ago.
1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey the orders given to it by human beings, except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
What happens if the robot has to kill a human to stop that human from killing another? Stack overflow? That's why rules 1 and 2 fail. And 3, protect its own existence? I don't like the sound of that at all.