What is The Singularity?

"To appreciate the nature and significance of the coming “singularity”, it is important to ponder the nature of exponential growth. Toward this end, I am fond of telling the tale of the inventor of chess and his patron, the emperor of China. In response to the emperor’s offer of a reward for his new beloved game, the inventor asked for a single grain of rice on the first square, two on the second square, four on the third, and so on. The emperor quickly granted this seemingly benign and humble request. One version of the story has the emperor going bankrupt as the 63 doublings ultimately totalled 18-million trillion grains of rice. At 10 grains of rice per square inch, this requires rice fields covering twice the surface area of the earth, oceans included. Another version of the story has the inventor losing his head." – Ray Kurzweil
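Kurzweil's arithmetic is easy to verify. The sketch below (assuming an Earth surface area of roughly 197 million square miles) sums the grains across all 64 squares and compares the resulting rice-field area to the planet:

```python
# Grains on a 64-square chessboard: 1 + 2 + 4 + ... + 2**63
total_grains = sum(2**square for square in range(64))  # = 2**64 - 1
print(f"{total_grains:,}")  # → 18,446,744,073,709,551,615 (~18 million trillion)

# At 10 grains per square inch, the area of rice fields required:
area_sq_in = total_grains / 10

# Convert to multiples of the Earth's surface (~197 million square miles)
EARTH_SQ_MILES = 197e6
SQ_IN_PER_SQ_MILE = (5280 * 12) ** 2  # inches per mile, squared
area_sq_miles = area_sq_in / SQ_IN_PER_SQ_MILE
print(f"{area_sq_miles / EARTH_SQ_MILES:.1f} x Earth's surface")
```

The result comes out to a little over twice the Earth's surface, oceans included, matching the figure in the quote.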


The Technological Singularity is the predicted point in the near future when exponential growth in technology will allow artificial intelligence to exceed human intellectual capacity. It is best described as an intelligence explosion, and its possibilities are unfathomable.


Such an intelligence explosion would change life as we know it far beyond anything foreseeable at present levels of human intelligence.

Most recognised writers on the Technological Singularity, such as Vernor Vinge and Ray Kurzweil, define the concept in terms of the technological creation of super-intelligence. They argue that it is difficult or impossible for present-day humans to predict what human beings' lives will be like in a post-singularity world. Lev Grossman wrote:


The difficult thing to keep sight of when you're talking about the Singularity is that even though it sounds like science fiction, it isn't, no more than a weather forecast is science fiction. It's not a fringe idea; it's a serious hypothesis about the future of life on earth. There's an intellectual gag reflex that kicks in any time you try to swallow an idea that involves super-intelligent immortal cyborgs, but suppress it if you can, because while the Singularity appears to be, on the face of it, preposterous, it's an idea that rewards sober, careful evaluation.


Ray Kurzweil defined the Singularity as the time when the Law of Accelerating Returns has reached such an extreme pace that technological progress appears effectively instantaneous, after which we will live in a new world.


Mathematician and author Vernor Vinge wrote an essay in 1993, in which he applied the term to the moment in the future when our technology’s intelligence exceeds our own. For him, it will be a moment when life as we know it will have changed forever and normal rules no longer apply.


Within a quarter century, non-biological intelligence will match the range and subtlety of human intelligence. It will then soar past it because of the continuing acceleration of information-based technologies, as well as the ability of machines to instantly share their knowledge. Intelligent nano-robots will be deeply integrated in our bodies, our brains and our environment, overcoming pollution and poverty, providing vastly extended longevity, full-immersion virtual reality incorporating all of the senses (like The Matrix), “experience beaming” (like “Being John Malkovich”) and vastly enhanced human intelligence. The result will be an intimate merger between the technology-creating species and the technological evolutionary process it spawned. – Ray Kurzweil


The Singularity isn't a new idea. In 1965, British mathematician I.J. Good described something he called an "intelligence explosion":


Let an ultra-intelligent machine be defined as a machine that can far surpass all the intellectual activities of any man, however clever. Since the design of machines is one of these intellectual activities, an ultra-intelligent machine could design even better machines; there would then unquestionably be an "intelligence explosion", and the intelligence of man would be left far behind. Thus, the first ultra-intelligent machine is the last invention that man need ever make, provided that the machine is docile enough to tell us how to keep it under control.


In 1983, Vernor Vinge greatly popularised Good’s notion of an intelligence explosion:


We will soon create intelligences greater than our own. When this happens, human history will have reached a kind of singularity, an intellectual transition as impenetrable as the knotted space-time at the centre of a black hole, and the world will pass far beyond our understanding. This singularity, I believe, already haunts a number of science-fiction writers. It makes realistic extrapolation to an interstellar future impossible. To write a story set more than a century hence, one needs a nuclear war in between ... so that the world remains intelligible.


In his gripping article "2045: The Year Man Becomes Immortal", Lev Grossman wrote:


So if computers are getting so much faster, so incredibly fast, there might conceivably come a moment when they are capable of something comparable to human intelligence. Artificial intelligence. All that horsepower could be put in the service of emulating whatever it is our brains are doing when they create consciousness — not just doing arithmetic very quickly or composing piano music but also driving cars, writing books, making ethical decisions, appreciating fancy paintings, making witty observations at cocktail parties.


If you can swallow that idea, and Kurzweil and a lot of other very smart people can, then all bets are off. From that point on, there's no reason to think computers would stop getting more powerful. They would keep on developing until they were far more intelligent than we are. Their rate of development would also continue to increase, because they would take over their own development from their slower-thinking human creators. Imagine a computer scientist that was itself a super-intelligent computer. It would work incredibly quickly. It could draw on huge amounts of data effortlessly. It wouldn't even take breaks to play Farmville.


Sources and further reading:
COPYRIGHT © 2015 Singularity Institute Africa