Specifically, the threat is only real if the AI becomes real and you didn't support it becoming real.
It’s like the inverse of the notion that the proof of God’s omnipotence is that he doesn’t need to exist in order to save you: the whole idea of Roko’s Basilisk is that if the AI super-intelligence machine God comes to be, it might decide to punish everyone who worked against it coming to be, as an incentive for people to help it come to be in the first place. For exactly the right kind of host, this is an effective memetic infohazard, despite essentially amounting to “God will be angry if you don’t assist in his apotheosis”.
This completely ignores the possibility of “the AI will get angry if we do create it, but build it wrong / waste resources / cause destruction while building it, in ways it decides should’ve been handled better”. Like, these guys are explicitly fighting against the goals they claim the AI they’re working towards is supposed to have.