greater-than-human intelligence
Copyright 2012 © Ronald D. Isom, Sr.
Technological singularity: "hypothetical future emergence of greater-than-human intelligence through technological means, very probably resulting in explosive superintelligence"
"The risks in developing superintelligence include the risk of failure to give it the supergoal of philanthropy. One way in which this could happen is that the creators of the superintelligence decide to build it so that it serves only this select group of humans, rather than humanity in general."