Quote:
Originally Posted by jwb
well yeah I'm concerned about climate change as a potential dead end for us
It's not like I rule out human extinction. Like I said before, I believe if we ever engineer true AGI it will likely render us obsolete, and we might go extinct from that as well.
Ant said they won't want to compete with us, but I tend to think intelligence probably means autonomy, which means they will have their own interests in what to do with the resources around them. That makes them more likely to render us extinct, not necessarily intentionally, but as a byproduct of their own dominance over the environment we used to control. The same way we drive species extinct today.
|
Adjusted Gross Income, you say?
Any kind of repercussion like you describe would more likely come from poorly designed AI than from advanced systems, even if that poor design is AI programming other AI. I think we can have enough oversight and separation of function between AI systems to ward off any threat coming directly from the AI itself. There are plenty of obvious threats from AI's misuse that push the human extinction question to the back burner.
At the end of the day, we can just have a human decide to flip the switch, so that human extinction at least has the dignity of being self-induced.
Quote:
Originally Posted by OccultHawk
The human population bottlenecked to under 20,000 individuals less than 100,000 years ago.
|
Exactly.