"God creates man. Man destroys God. Man creates dinosaurs..."
"Dinosaurs eat man. Woman inherits the earth..."
When we create the first superintelligent entity, we might make a mistake and give it goals that lead it to annihilate humankind, assuming its enormous intellectual advantage gives it the power to do so. For example, we could mistakenly elevate a subgoal to the status of a supergoal. We tell it to solve a mathematical problem, and it complies by turning all the matter in the solar system into a giant calculating device, in the process killing the person who asked the question.
7 comments:
actually I like the idea of the supercomputer. I think that would make a very, very interesting sci-fi short story.. kinda like the 9 billion names of god. (a great one by Arthur C. Clarke if I remember)
and in case you never read it:
http://www.geocities.com/rojodos/docs/9000000000.htm
For some reason, on the way out to KC...my mind was wandering and I got to thinking about the technological singularity. I honestly don't know what's wrong with me, but when I woke up this morning I couldn't get it out of my head...and I came upon that quote...anyway, I'm a weird boy (probably why I don't ever seem to fit in).
Thanks for the link Jimu.
BTW--that whole bit about dinosaurs is from JURASSIC PARK.
Ahh there was a whole bit I couldn't see.
Dang.
That explains a lot.
My turn for a story:
http://www.multivax.com/last_question.html
Yeah, I was playing around one day and I figured out how to do that...
I think I've read the story you linked to before; that's the one where they ask the supercomputer how to reverse entropy, right? And the computer's "solution" is basically to re-create the universe (I think it even says "let there be light" like God at the beginning of the Bible--which as we all know is the best part).