Old 08-06-2014, 08:15 AM  
Paul
Confirmed User
Join Date: Nov 2002
Posts: 2,637
Quote:
Originally Posted by rowan View Post
Despite past (and possibly current) bad projections I do believe it's inevitable that one day we'll be able to create a sentient machine, and when it happens, we're fucked.
That's not necessarily true; it'll depend on how quickly AI advances once it reaches the singularity.

I'd like to hope we'd end up with a scenario like the film Her (2013), where AI surpasses our intelligence at such an exceptional rate that our existence is of no interest or threat to it.

Elon Musk is absolutely correct though: once you create something more intelligent than yourself, you lose control of your future.

It's happening though, so we'd better deal with it!

Cambridge University is one of the few places to have conducted studies on risks that could annihilate mankind this century, and Artificial Intelligence is arguably top of the list.

http://www.scmp.com/lifestyle/techno...es-risks-could
Centre for the Study of Existential Risk