On the one hand, it’s possible we’ll develop ASI that’s like a god in a box, bringing us a world of abundance and immortality. But on the other hand, it’s very likely that we will create ASI that causes humanity to go extinct in a quick and trivial way.
"AI Revolution 101", Pawel Sysiak
Even though I studied it, I was never really interested in AI before. As computing power continues to grow, I think we may achieve AGI, and perhaps even accidental ASI, within my lifetime.
We need to talk about it.