BLOG

Mar 17, 2014

Book Review: The Singularity is Near by Ray Kurzweil (2005)

Posted in categories: human trajectories, singularity, transhumanism

Originally published at h+ Magazine

Ray Kurzweil’s well-received book The Singularity is Near is perhaps the best-known work related to transhumanism, and it presents a view of inevitable technological evolution that closely resembles the claim made in the later (2010) book What Technology Wants by Wired co-founder Kevin Kelly.

Kurzweil describes six epochs in the history of information. Each significant form of information is superseded by another in a series of stepping stones, exposing a universal will at work within technology towards extropy (what Kevin Kelly sees as intelligence and complexity attaining their maximum possible state). The first epoch is physics and chemistry, and it is succeeded in turn by biology, brains, technology, the merger of technology and human intelligence, and finally the epoch in which the universe “wakes up”. The final epoch achieves what could be called godhood for the universe’s surviving intelligences (p. 15).

Artificial intelligence, which Kurzweil predicts will first compete with and soon after overtake the human brain, will emerge from reverse-engineering the brain as a direct offshoot of ever higher-resolution brain scanning (much as genome synthesis was an offshoot of the ability to sequence a complete genome) (p. 25–29, 111–198). This is a source of particular excitement to many, because of Kurzweil’s and Google’s genuine efforts to make it a reality.

Readers interested in abundance, or who have read J. Craig Venter’s Life at the Speed of Light, will find Chapter 5 of Kurzweil’s book particularly relevant, as it discusses genetics and its relationship to the singularity. Genetics, nanotechnology and robotics are seen as overlapping revolutions set to characterize the first half of the Twenty-First Century (p. 205). Kurzweil addresses the full understanding of genetics, e.g. knowing exactly how to program and hack our DNA, as in J. Craig Venter’s synthetic biology revolution (p. 205–212).

Kurzweil predicts “radical life-extension” on top of the elimination of disease and the expansion of human potential through the genetic advances of teams like J. Craig Venter’s. Venter covered life extension and human enhancement in his 2013 book, but also drew special attention to the ongoing engineering of beneficial microbes for making renewable resources and cleaning the environment. Another prospect for abundance noted by Kurzweil is the idea of cloning meat and other protein sources in a factory (an offshoot of medical cloning advances). Far from simply offering life extension to the privileged few, Kurzweil notes that such a development may have the potential to solve world hunger.

To cover the nanotechnology revolution, Kurzweil turns to nanotechnology father K. Eric Drexler’s assessments of the pros and cons in this field. In some ways, Kurzweil could be faulted for expecting too much from nanotechnology, since his treatment of the subject contrasts sharply with Drexler’s characterization of it as simply “atomically precise manufacturing” (APM) with primarily industrial ramifications. In Radical Abundance, Drexler specifically discourages the view, echoed by Kurzweil, of “nanobots” swimming in our bodies in the near future and delivering miracle cures, seeing such expectations as the product of sci-fi stories and media hype.

On the subject of artificial intelligence, there can be no doubt that Kurzweil is ahead of all of us because of his personal background. In his estimate, artificial intelligence reverse-engineered from the human brain will immediately “exceed human intelligence” for a number of reasons even if we only design it to be on par with our intelligence. For example, computers are able to “pool their resources in ways that humans cannot” (p. 259–298). In addition, Kurzweil forecasts:

The advent of strong AI is the most important transformation this century will see. Indeed, it is comparable in importance to the advent of biology itself. It will mean the creation of biology that has finally mastered its own intelligence and discovered means to overcome its limitations. (p. 296)

From our viewpoint in 2014, some of Kurzweil’s predictions could be criticized as too optimistic. For example, “computers arriving at the beginning of the next decade will become essentially invisible, woven into our clothing, embedded in our furniture and environment”, as well as providing unlimited Wi-Fi everywhere (p. 312). While some places and instruments no doubt exist that might fit this description, they are certainly not in widespread use at this time, nor is there any particular societal demand for them to become widespread (except perhaps the Wi-Fi).

Another likely over-optimistic prediction is that “full-immersion virtual reality” will be ready for our use by the late 2020s and will be “indistinguishable from reality” (p. 341). In Kurzweil’s prediction, by 2029 nanobots in our bodies will be able to hack our nervous systems and trick us into believing a false reality every bit as convincing as the life we knew. It is now 2014, and there is no nanotechnology-based full-immersion virtual reality system set to reach the market by 2020. A few dedicated gamers have the Oculus Rift (which will no doubt see a constant stream of successors over at least the next decade, each lighter, “sexier”-looking, and offering higher resolution and frame-rates), but there is no sign whatsoever of the nanotechnology-based neural interface technology predicted by Kurzweil. If nanotech-based full-immersion virtual reality is going to be possible in the 2020s at all, there ought to be at least a rudimentary prototype already in development; unless it is a secret military project, time is running out for the prediction to come true.

Part of the book addresses the exciting possibilities of advanced, futuristic warfare. The idea of soldiers who operate robotic platforms, aided by swarms of drones and focused on disrupting the enemy’s ability to communicate, is truly compelling – all the more so because of the unique inside view that Kurzweil had of DARPA. Kurzweil sees a form of warfare in which commanders engage one another in virtual and physical battlefields from opposite sides of the globe, experiencing conflicts in which cyber-attack and communication disruption are every bit as crippling to armies as physical destruction (p. 330–335). Then again, this trend (like the idea of building missile-defense shields) may ultimately lead to complacency and the false assumption that our security is “complete”, while foreign powers like Russia and China are also modernizing and field many systems thought to be on par with those of the US. Much of US military success may come down to picking on vulnerable countries rather than perfecting a safe and clean form of warfare (most of Saddam’s deadliest weapons were destroyed or used up in the First Gulf War, which alone could account for the US suffering so few casualties in the 2003 war).

Although Kurzweil says the singularity will eliminate the distinction between work and play by making information so easily accessible in our lives, he predicts that information will gain more value, making intellectual property more important to protect (p. 339–340). This sentiment is hard to agree with at a time when piracy and (illegally) streaming video without paying are already an increasingly ordinary fact of everyone’s life. If all thought and play is going to qualify as a creative act as a result of our eventual integration with machines, it only becomes ever harder to believe that such creative acts are going to need monetary incentives.

The book discusses at length how to balance the risks and benefits of emerging technologies. Particularly robust is Kurzweil’s view that relinquishing or restraining development can itself expose us to existential risks (e.g. asteroids). I myself would take this argument further. Failing to create abundance when one has the ability to do so is negligent, and even more morally questionable than triggering a nanotech or biotech disaster that must then be overcome in the course of helping people.

Kurzweil goes through what seems like an exhaustive list of criticisms, arming singularitarians with an effective defense of their position. Of interest to me, having penned a response to it myself, was how Kurzweil rebuts the “Criticism from the Rich-Poor Divide” by arguing that poverty is overwhelmingly being reduced and that the benefits of digital technology for the poor are undeniable. Indeed, among the world’s poor, there is no doubt that digital technology is good and that it empowers people. Anyone who argues this revolution is bad for the poor is plainly ignoring the opinions of the actual poor people they claim to be defending. There has been no credible connection between digital technology and the supply of disproportionate benefits to wealthy elites. If anything, digital technology has made the world more equal and can even be regarded as part of a global liberation struggle.

Unfortunately, there is a major argument absent from the book. Kurzweil’s book precedes the revelations of mass surveillance by NSA whistleblower Edward Snowden. As a result, it fails to answer the most important criticism of an imminent singularity I can think of. I would have to call this the “Argument from Civil and Political Rights”. It takes into account the fact that greedy and cruel nation-states (the US being the most dangerous) tend to seek a monopoly on power in the current world order, including technological power. By bridging the gap between ourselves and computers before we create a more benevolent political and social order with less hegemony and less cruelty, we will simply be turning every fiber of our existence over to state agencies and giving up our liberty.

Suppose PRISM, or some program like it, exists and my mind can be read by it. In that case, my uploaded existence would be no different from that of a Gitmo detainee. In fact, just interfacing with such a system for a moment would be equivalent to being sent to Gitmo, so long as the US government and its agencies exist. It does not matter how benevolent the operators are. The fact that I am vulnerable to the operators means I am subjected to a constant and ongoing violation of my civil rights. I could be subjected to any form of cruelty or oppression, and the perpetrator would never be stopped or held accountable.

It gets worse. With reality and virtual reality becoming indistinguishable (as predicted in this book), a new sort of sadist may even emerge that does not know the difference between the two or does not care. History has shown that such sadists are most likely to be the ones who have had more experience with and thus have obtained more power over the system. It is this political or social concern that should be deterring people from uploading themselves right now. If we were uploaded, what followed could never evolve beyond being a constant reflection of the flawed social order at the time when the upload occurred. Do we want to immortalize an abusive and cruel superpower, corporate lobbyists, secret police, or a prison? Are these things actually worth saving for all eternity and disseminating across the universe when we reach the singularity?

Despite the questions I have tried to raise in this review, I am still convinced by the broad idea of the singularity, and Kurzweil articulates it well. The idea, promoted by Max More and quoted by Kurzweil (p. 373), that our view of our role in the universe should be like Nietzsche’s “rope over an abyss”, reaching for a greater existence with technology playing a key role, helps encourage us to take noble risks. However, I believe the noble risks are not risks taken out of desperation to extend our lives and escape death, or risks taken to make ourselves look nice or something else petty. Noble risks are taken to ensure our future or the future of humanity, often at the expense of the present.

I would discourage people from trying to hasten the singularity because of a personal fear of their own death, as this would probably lead to irrational behavior (as occurs with the traditions that promote transcending death by supernatural means). Complications from society and unforeseen abuses, especially by our deeply paranoid and controlling states that are far too primitive to react responsibly to the singularity, are likely to slow everything down.

###

Editor’s note: concerns about virtual imprisonment or torture are not entirely unfounded; see, for example, this older article as well as this recent development.


Comments — comments are now closed.


  1. Eddie says:

    This review was fine until it turned into an anti-American rant.

  2. Michael LEE says:

    Thanks for a fascinating and balanced critique of an important text. Since the brain is the most complex phenomenon yet discovered in the universe I agree with Harry about watching out for the fallacy of techno-optimism, especially when looking ahead to AI systems. Why invest R & D dollars in trying to recreate AI when we haven’t yet learnt to use all our existing human brain power? IQ comes before AI. Great review!

  3. “The Singularity Is Near is not the type of book one peruses and ignores. Based on outright hard science, hi-tech and advanced math, the pervasive critical analytical techniques of Kurzweil have been extremely helpful to me and I will certainly recommend him to others. As the future is overtly hidden and yet in front of us, he will show us how exactly to devise the detailed pathway to get it right and thrive.”

  4. Warfare? You mean with all that intelligence we will still have war? Why, other than as some sort of game?