October 21, 2017

Eliezer Yudkowsky has described how an AI apocalypse could unfold. In a 2008 essay he asked: "How likely is it that artificial intelligence will cross the entire vast gap from amoeba to village idiot, and then stop at the level of human genius?" His answer: "It is physically possible to build a brain that computes a million times faster than a human brain... If a human brain were accelerated in this way, a subjective year of thinking would pass in 31 physical seconds in the outside world, and a millennium would fly by in eight and a half hours."

Does artificial intelligence endanger humanity?