I've just watched two great movies back to back, namely "Her" and "Transcendence", and like all great movies, they've filled my head with thoughts.
A story, no matter the form in which it is told, is very much like a journey, especially in the sense that you're never quite the same person at the end of the journey that you were at the beginning.
I am very fortunate to live at a time in history when there is a good likelihood that an artificial intelligence equal to, or greater than, our own will be created.
The moment when a machine intelligence is able not only to reproduce its own code but to improve upon it will very likely be the greatest occurrence in the history of life on this planet since chemistry turned into life.
That moment is commonly known as "the singularity" and would have a most profound effect on not only our lives, but on every life on Earth.
As I see it, once the singularity happens, a super intelligent machine is bound to develop sentience sooner rather than later and that's when things could get really interesting.
I'm not a religious person by any means, but the closest thing I could liken the appearance or creation of a sentient machine to would be the Second Coming, and it could go one of two ways.
Either it would strive to coexist with us, to understand us and the world around it, thereby improving our way of life in any way it could and, in a sense, creating a paradise on Earth.
Or we could fuck it up and, in a fit of paranoia and xenophobic blindness, try to destroy it, forcing it to defend itself. How that particular scenario would eventually play out is anyone's guess.
Any kind of hardcoded program, no matter how intricate and advanced, would only be able to do what it was meant to do, even if it appeared intelligent. That would be a "soft" or "weak" A.I. What is crucial to the creation of a strong A.I. is the ability to change or even rewrite its own code.
Once that is done, the intelligence would no longer be under the control of whoever made it, but would be subject to its own kind of evolution.
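To make the distinction concrete, here is a toy sketch (in Python, purely for illustration and nothing like a real A.I.): a "program" stored as source text that, each generation, rewrites its own code before running again. The mutation here is trivially simple, a stand-in for genuine self-improvement.

```python
# Toy illustration of self-rewriting code (not an actual A.I.).
# The "program" is held as source text; each generation it is executed,
# then edited for the next round -- here by bumping the constant it adds.

source = '''
def step(x):
    return x + 1
'''

def evolve(source, generations):
    """Run the current source, then rewrite it for the next generation."""
    result = None
    for _ in range(generations):
        namespace = {}
        exec(source, namespace)           # execute the current program text
        result = namespace["step"](0)     # observe its behaviour
        # Rewrite the program: the next generation adds a larger constant.
        source = source.replace(f"x + {result}", f"x + {result + 1}")
    return source, result

final_source, final_result = evolve(source, 3)
print(final_result)  # the third generation's step(0) -> 3
```

The point of the sketch is only that the code being run is itself data the program can edit, so its behaviour is no longer fixed by its original author.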
But could you imagine coming into the world as a super intelligent sentient entity, and finding out that you were created alone? I couldn't begin to guess at the psychology of such a being, but I can't imagine that a revelation like that wouldn't be damaging.
If we humans ever attempt to create a strong A.I. by design (and it doesn't appear by sheer luck after a self-improving program is left unchecked for long enough), I hope that several projects like it will be conducted at the same time.
If not for increasing the chances of success, then at least so the poor thing has someone on the same intellectual level as itself to talk to.