The singularity, in the context of AI, is a theoretical event in which an intelligent system meeting the following criteria is deployed:
1. Capable of improving the range of its own intelligence, or of deploying another system with such an improved range
2. Willing or compelled to do so
3. Able to do so in the absence of human supervision
4. The improved version sustains criteria (1) through (3) recursively
By induction, the theory then predicts that such an event generates a sequence of successors, with a rate of intelligence increase that may vastly exceed the rate of biological brain evolution.
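To make the induction concrete, here is a minimal sketch of the recurrence it implies. The improvement function g is a purely illustrative assumption (nothing in the theory fixes its form); the point is only that any superlinear g compounds far faster than a fixed evolutionary rate:

```python
# Toy model of recursive self-improvement: each generation deploys a
# successor whose capacity for improvement scales with its own
# intelligence. The growth factor g(i) is an illustrative assumption.

def g(intelligence: float) -> float:
    """Assumed per-generation improvement factor: smarter systems
    improve themselves faster (superlinear growth)."""
    return 1.0 + 0.1 * intelligence

def simulate(generations: int, i0: float = 1.0) -> list[float]:
    """Apply criteria (1)-(4) recursively: I_{n+1} = g(I_n) * I_n."""
    levels = [i0]
    for _ in range(generations):
        levels.append(g(levels[-1]) * levels[-1])
    return levels

# Grows super-exponentially, while biological evolution is (at best)
# a slow, roughly constant rate of change per generation.
print(simulate(10))
```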
How obligated this self-improving entity, or population of descendant entities, would be to preserve human life and liberty is indeterminate. The idea that such an obligation could be encoded as an irrevocable software contract is naive given the capabilities implied by criteria (1) through (4) above. As with other powerful technologies, the risks are as numerous and far-reaching as the potential benefits.
Risks to humanity do not require intelligence. There are other contexts in which the term singularity is used; they are outside the scope of this AI forum, but a brief mention is worthwhile for clarity. Genetic engineering, nuclear engineering, globalization, and basing an international economy on a finite energy source consumed thousands of times faster than it formed in the earth: these are other examples of high-risk technologies and mass trends that pose risks as well as benefits to humanity.
Returning to AI, the major caveat in the singularity theory is its failure to incorporate probability. Although it may be possible to develop an entity that conforms to criteria (1) through (4) above, doing so may be improbable enough that the first event occurs long after all the languages currently spoken on Earth are dead.
At the other extreme of the probability distribution, one could just as easily argue that there is a nonzero probability that the first such event has already occurred.
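One way to see how much hangs on the unknown probability is to model the first such event as the first arrival of a Poisson process. The rates below are arbitrary assumptions, not estimates; the sketch only shows that the same model spans both extremes of the argument:

```python
import random

def first_event_year(rate_per_year: float) -> float:
    """First arrival time of a Poisson process with the given rate.
    The rate itself is the unknown; the values tried are illustrative."""
    return random.expovariate(rate_per_year)

# A tiny rate puts the expected first event far beyond the lifetime of
# any current language; a rate near 1/yr is compatible with it having
# already happened.
for rate in (1e-6, 1e-2, 1.0):  # events per year (assumed, not estimated)
    samples = [first_event_year(rate) for _ in range(10_000)]
    mean_wait = sum(samples) / len(samples)
    print(f"rate={rate:g}/yr -> mean wait ~ {mean_wait:,.0f} years")
```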
Along those lines, if a smarter presence were already resident on the Internet, how likely is it that it would find it in its best interest to reveal itself to lowly human beings? Do we introduce ourselves to a passing maggot?