So you'd rely on having more time. Would you agree then that if something scaled to superintelligence tomorrow, or got smart enough to start self-improving tomorrow, everyone would be dead? Seems like the sort of important fact you might want to communicate to, say, Congress.