I’m watching this sort-of documentary about Isaac Asimov. And I’m sure I’ve heard things like this lots of times before, but now I’m thinking about it.
People say things like “What if computers become more complex than human minds? What if humans become obsolete?”
Well, that’s only a problem if you think being less intelligent than someone else makes you worthless.
Robots may well end up with more abilities than we have, but that’ll only lead them to get rid of us if we teach them eugenics.