Stereo Nacht
Meh. It could be anything, including a bug in which the shut-down instruction is not properly parsed as such because it's part of another problem the AI is analyzing. Kinda reminds me of one of Isaac Asimov's Robot stories, where AIs are prevented from causing any harm to any human, but they are trying to find a solution to faster-than-light travel.
Stereo Nacht
The problem is, the solution causes humans to enter a (temporary) state of non-existence, so the Russian super-AI broke down on it. So the US guys, who (for some reason) gave a childish character to their own super-AI (seriously, who would want a childish computer playing pranks on them?), formulate the question differently.
Stereo Nacht
They say it's just a theoretical question, so even if the solution may cause harm, or even death, to humans, it will not actually harm a human. And with that work-around, it does find a solution, but combined with its childish character, it ends up playing a "prank" on them.
Stereo Nacht
Sooooo... Yeah. It's all about how it was formulated.
Foggy
Well, that shouldn’t end horribly for us all.