Ray Kurzweil is wrong: The Singularity is not near

By Nathan Pensky, written on February 3, 2014

From The News Desk

Technology culture celebrates "the crazy ones," those who expand our understanding of life, who can forge new realities out of their own imagination. Then again, sometimes "the crazy ones" come up with things that are just plain crazy.

I am thinking of Ray Kurzweil. No doubt he's brilliant. He's the inventor of the flatbed scanner and an innovator of many other technologies, including character recognition, text-to-speech, speech recognition, and the music synthesizer. Currently a director of engineering at Google, he's working on natural language understanding for the company, and he's a recipient of the National Medal of Technology and Innovation.

Despite these accomplishments, however, Kurzweil is perhaps best known for his science fiction-sounding predictions about the future, as described in his book, "The Singularity is Near."

He posits that near-future technology will advance so profoundly that collective human biology and artificial intelligence will merge into a super-intelligent reality he calls the Singularity, which I'll get to in a moment. Honestly, I can't even get reception on my phone at my house in rural Pennsylvania. And if reading the vile comment threads on media sites like Gawker doesn't convince you that we humans are nowhere near ready to achieve intellectual and physical nirvana with machines, nothing will.

Nevertheless, the Singularity, as Kurzweil describes it, is:

a future period during which the pace of technological advance will be so rapid, its impact so deep, that human life will be irreversibly transformed... The Singularity will represent the culmination of the merger of our biological thinking and existence with our technology, resulting in a world that is still human but transcends our biological roots. There will be no distinction, post-Singularity, between human and machine or between physical and virtual.
Kurzweil estimates that the Singularity will happen in the year 2045.

But that's not all. The Singularity, which is to say a humanity which has merged with technology, will turn its own intelligence into pure energy and saturate the universe at, or beyond, the speed of light.

In the aftermath of the Singularity, intelligence, derived from its biological origins in human brains and its technological origins in human ingenuity, will begin to saturate the matter and energy in its midst. It will achieve this by reorganizing matter and energy to provide an optimal level of spread out from its origin on Earth.

Kurzweil goes on to explain how it will propel information-matter faster than the speed of light and how "the 'dumb' matter and mechanisms of the universe will be transformed into exquisitely sublime forms of intelligence..."

If you think all that sounds weird, you're not alone. Kurzweil's claims have garnered much media attention, as well as criticism. After all, it's hard to say that the earth will transform into a pulsating orb of pure intelligence radiating outward at the speed of light and not turn a few heads.

Nevertheless, as trippy as this may sound, there are some biological underpinnings. My wife Allison Connell is a cognitive psychologist and neuroscientist at Allegheny College, and she tells me that something akin to Kurzweil's transference of matter to "pure energy" happens already in the body. Our sensory systems actually do transform external energy to internal energy in an amazing process called "transduction." For instance, photoreceptors in our eyes absorb photons of different wavelengths (light energy, often straight from the sun itself) and convert that light energy to neural energy.

Further down this process of conversion, the brain creates actual electricity, which it sends back out into the world through action potentials in brain cells. This is the signal that EEG records: small voltages of brain electricity leaving the skull, representing that once-external energy. Of course, all this is a far cry from Kurzweil's theory of pure intellect-energy.

Another critic calls into question Kurzweil's working theory of mind, that which would enable the hybridization of biological intelligence with artificial intelligence, an idea central to Kurzweil's Singularity. Philosopher and psychologist Colin McGinn wrote a negative review for the New York Review of Books of Kurzweil's "How to Create a Mind: The Secret of Human Thought Revealed." In this review, McGinn draws attention to the flaw underlying Kurzweil's theory of mind, a theory called the Pattern Recognition Theory of Mind, or PRTM.

McGinn claims that Kurzweil has reduced the many different cognitive processes of the human brain to the kind of "pattern recognition" that machines are good at. "What has happened is that [Kurzweil] has switched from patterns as stimuli in the external environment to patterns as mental entities, without acknowledging the switch," writes McGinn. (McGinn positively reviewed Kurzweil's earlier book "The Age of Spiritual Machines.")

Douglas Hofstadter, professor of cognitive science and Pulitzer Prize-winning author of the classic tome on intelligence and consciousness "Gödel, Escher, Bach: An Eternal Golden Braid," is also not a fan. He says of Kurzweil's theories:

If you read Ray Kurzweil's books and [fellow futurist] Hans Moravec's, what I find is that it's a very bizarre mixture of ideas that are solid and good with ideas that are crazy. It's as if you took a lot of very good food and some dog excrement and blended it all up so that you can't possibly figure out what's good or bad. It's an intimate mixture of rubbish and good ideas, and it's very hard to disentangle the two, because these are smart people; they're not stupid... [T]he point is, there doesn't seem to be any discussion anywhere of "Is this good?" It's all "Let's go faster! Faster! Faster!" Well, where are you going? What are you trying to do? And I don't see any asking of these questions.
Kurzweil has gamely addressed such criticisms, even devoting an entire chapter of "The Singularity is Near" to them. His list of criticisms of the Singularity and the law of accelerating returns includes "the criticism from incredulity" (basically people going, "whaaaat?"), "the criticism from failure rates" (that our design and construction of tech could already be approaching its maximum capacity, which certainly would put a crimp on its extending indefinitely to the horizon), and "the criticism from Holism" (which Kurzweil summarizes as "machines are organized as rigidly structured hierarchies of modules, whereas biology is based on holistically organized elements in which every element affects every other"). These are just a few that Kurzweil addresses and refutes, sometimes very compellingly.

But the biggest problem with the concept of the Singularity, in my opinion, is conceptual, and it's one he doesn't address in the book: his constant conflation of biological evolution with his versions of social and tech "evolution."

Kurzweil has set up a narrative in which biological evolution, cultural development, and the advancement of computing technology are all part of the same immutable force, never mind that the will of human beings factors into the creation of both culture and technology. For Kurzweil, the advance of technology is as inevitable as biological evolution and can be plotted on the same graph.

Central to his Singularity thesis is the concept that the technology and culture that humans make are part of the same process as the bodies they have evolved.

Kurzweil presents this figure in "The Singularity is Near" to represent and support his view of the exponential acceleration of human evolution. But the earliest data points in this "evolution" are biological changes. In the middle you see social changes, such as art and agriculture. And at the end, you see inventions of technology, like television, radio, and computers. All of these are presented as "human evolution," but they actually represent three categories that operate on very different time scales.

Kurzweil shores up his Singularity with a concept he calls "the law of accelerating returns," which states that "the rate of evolutionary processes progresses exponentially" and that "technological evolution is an outgrowth of – and a continuation of – biological evolution." (For a chuckle, follow the previous link and read the second paragraph.) The Singularity is also informed by the concept of ephemeralization, pioneered by Buckminster Fuller in his 1938 book "Nine Chains to the Moon": "the ability of technological advancement to do 'more and more with less and less until eventually you can do everything with nothing,'" as well as "negative entropy," or the outflux of disorder to maintain stasis.

A key aspect of Kurzweil's law of accelerating returns is that it is a naturally occurring phenomenon, like biological evolution, one that advances outside the will of human beings. But how could it be outside the will of humans when humans are doing all the inventing? A theoretical forerunner along these lines, and in fact someone cited by Kurzweil as "the first Singularity theorist," was Henry Adams, historian and descendant of US Presidents John Adams and John Quincy Adams.

In his essay "A Law of Acceleration," Adams outlines how the "coal-output of the world, speaking roughly, doubled every ten years between 1840 and 1900, in the form of utilised [sic] power, for the ton of coal yielded three or four times as much power in 1900 as in 1840." Adams anticipates Kurzweil's "exponential growth," going so far as to state: "A law of acceleration, definite and constant as any law of mechanics, cannot be supposed to relax its energy to suit the convenience of man."
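Adams' figures imply a simple compounding rule: doubling every ten years over sixty years yields a 64-fold increase. A quick sketch of that arithmetic (the function name and baseline are illustrative, not Adams' own):

```python
# Exponential growth from a fixed doubling period, applied to Adams'
# observation that utilised coal power doubled every decade, 1840-1900.

def growth_factor(years: float, doubling_period: float) -> float:
    """How many times a quantity multiplies over `years`, given its doubling period."""
    return 2 ** (years / doubling_period)

# 1840 to 1900 is 60 years; doubling every 10 years gives 2^6 = 64x.
print(growth_factor(1900 - 1840, doubling_period=10))  # 64.0
```

The same one-line formula underlies every "accelerating returns" claim in this article: pick a doubling period, and the growth factor follows mechanically.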

Adams' observation about the production of coal in the United States anticipates another observation oft-cited in entrepreneurial circles, and by Kurzweil, concerning integrated circuits specifically and the advance of technology in general. This observation, called Moore's Law, holds that "over the history of computing hardware, the number of transistors on integrated circuits doubles approximately every two years."
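Moore's Law has the same compounding structure as Adams' coal figures, just with a two-year doubling period. A minimal sketch (the 1971 starting count of 2,300 transistors is the oft-quoted figure for the Intel 4004, used here purely for illustration):

```python
# Projecting transistor counts under Moore's Law: a doubling every
# two years multiplies the count by 2**(elapsed_years / 2).

def moores_law(start_count: int, start_year: int, year: int) -> float:
    """Projected transistor count at `year`, doubling every two years."""
    return start_count * 2 ** ((year - start_year) / 2)

# Starting from ~2,300 transistors (Intel 4004, 1971), ten doublings
# by 1991 project roughly 2,300 * 2**10 = 2,355,200.
print(moores_law(2_300, 1971, 1991))  # 2355200.0
```

Note that nothing in the formula explains *why* the doubling happens; it only extrapolates it, which is precisely the gap the rest of this article is about.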

Moore's Law has been appropriated by tech culture to represent a standard rate of advancement of technology itself (never mind that "advancement" along these lines is little more than "whatever happens to occur in tech"). A key aspect of this dynamic is its immutable progress regardless of the wills of individuals. Adams' coal metaphor and the entrepreneurial application of Moore's Law entail that the acceleration of technological advancement will forge ahead, whether any actual people want it that way or not. As Adams put it, it cannot be supposed to "relax its energy to suit the convenience of man."

And because this immutable acceleration of tech kinda sorta resembles biological evolution, to Kurzweil, they are one and the same thing.

At one point in "The Singularity is Near," Kurzweil states that the end result of biological evolution is the survival of a species, and that the end result of tech advancement, which he calls "evolution," is to fulfill a parameter set by the designer called the "utility function." But in order to place biological and social/technological "evolution" within the same narrative of exponential growth, he merges their respective results into an extremely convoluted metric he calls "order."

For Kurzweil, "order [is] information that fits a purpose. The measure of order is the measure of how well the information fits the purpose." Except such a "measure" isn't very clear, and certainly not falsifiable. Which is to say, when you can define "order" to fit whatever purpose you like, how could you ever show it to be wrong? This is perhaps why Kurzweil's critics are often scientists who subscribe to the falsifiability criterion of scientific theory.

Kurzweil even says, "For order, we need a measure of 'success' that would be tailored to each situation." So "order" is determined by "success," and "success" is determined on a case-by-case basis according to... basically whatever serves Kurzweil's argument. Apparently the most ordered and successful of all scenarios involves the hybridization of humanity with tech along a timeline determined by benchmarks in technology. Thus, "evolution" can mean whatever Kurzweil wants it to mean.

If the joining of these disparate types of "evolution" doesn't work, then the exponential growth schema Kurzweil constantly references falls apart. The beginning of Kurzweil's exponential graph of the "law of accelerating returns" includes biological evolutionary growth at the left end, like the development of the opposable thumb, which takes a long time. The middle of the graph includes social and cultural development, like the emergence of agriculture, writing, and art, which moves along a much faster timeline. The end of the graph includes computing advancement, which moves even faster.
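The critique above lends itself to a toy demonstration: string together milestones from categories that operate on wildly different time scales, and the intervals between successive "events" shrink by construction, so the resulting curve looks exponential no matter what. The milestones and years-before-present below are rough, illustrative values, not Kurzweil's actual data points:

```python
# Mixing biological, cultural, and technological milestones on one
# timeline: each category change compresses the time scale by orders
# of magnitude, producing an "accelerating" curve by selection alone.

milestones = [
    ("opposable thumb (biological)",      5_000_000),
    ("agriculture (cultural)",               10_000),
    ("writing (cultural)",                    5_000),
    ("radio (technological)",                   120),
    ("personal computer (technological)",        45),
]

for (name_a, t_a), (name_b, t_b) in zip(milestones, milestones[1:]):
    print(f"{name_a} -> {name_b}: time scale shrinks {t_a / t_b:,.0f}x")
```

The biggest compressions fall exactly where the category switches, from biology to culture and from culture to technology, which is the author's point: the "acceleration" lives in the choice of categories, not in a single underlying process.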

So the graph takes an exponential curve not because humans have moved inexorably along a track of "accelerating returns," but because Kurzweil has "ordered" data points that reflect the narrative he likes.

It's no doubt true that the speculative inquiry that informed Kurzweil's creation of the Singularity also informed his prodigious accomplishments in inventing new tech. But just because a guy is smart doesn't mean he's always right. The Singularity makes for great science fiction, but not much else.

[Figure from Wikimedia]