BLOG

Archive for the ‘singularity’ category: Page 4

Mar 25, 2024

Vinge—Technological-Singularity-Tx5uZ.pdf

Posted in category: singularity

Vernor Vinge, “The Coming Technological Singularity,” 1993.



Mar 25, 2024

Metal (Remix)

Posted in category: singularity

In honor of Vernor Vinge, father of the technological singularity hypothesis.

Mar 24, 2024

Vernor Vinge, father of the tech singularity, has died at age 79

Posted in categories: robotics/AI, singularity

Vinge won multiple Hugo Awards and created a sci-fi concept that drives AI researchers.

Mar 22, 2024

Is Singularity here?

Posted in categories: Ray Kurzweil, robotics/AI, singularity

One of the most influential figures in the field of AI, Ray Kurzweil, has famously predicted that the singularity will happen by 2045. Kurzweil’s prediction is based on his observation of exponential growth in technological advancements and the concept of “technological singularity” proposed by mathematician Vernor Vinge.

Mar 22, 2024

Technological singularity

Posted in categories: cosmology, Ray Kurzweil, singularity

It is with sadness — and deep appreciation of my friend and colleague — that I must report the passing of Vernor Vinge.


The technological singularity —or simply the singularity[1] —is a hypothetical future point in time at which technological growth becomes uncontrollable and irreversible, resulting in unforeseeable consequences for human civilization.[2][3] According to the most popular version of the singularity hypothesis, I. J. Good’s intelligence explosion model, an upgradable intelligent agent will eventually enter a “runaway reaction” of self-improvement cycles, each new and more intelligent generation appearing more and more rapidly, causing an “explosion” in intelligence and resulting in a powerful superintelligence that qualitatively far surpasses all human intelligence.[4]

Continue reading “Technological singularity” »

Mar 15, 2024

The Political Singularity and a Worthy Successor, with Daniel Faggella

Posted in categories: robotics/AI, singularity

Calum and David recently attended BGI24, the Beneficial General Intelligence summit and unconference, in Panama City. One of the speakers we particularly enjoyed listening to was Daniel Faggella, the Founder and Head of Research of Emerj.

Something that featured in his talk was a 3×3 matrix, which he calls the Intelligence Trajectory Political Matrix, or ITPM for short. As we’ll be discussing in this episode, one dimension of this matrix is the kind of end-goal future people desire as intelligent systems become ever more powerful; the other dimension is the kind of methods people want to use to bring about that desired future.

Continue reading “The Political Singularity and a Worthy Successor, with Daniel Faggella” »

Mar 9, 2024

10 Ways Science Fiction Got High Tech Wrong

Posted in categories: singularity, transportation

Or did it? From flying cars to the Singularity, here’s how some of the most popular visions of the high-tech future are panning out today.

Mar 6, 2024

AI singularity may come in 2027 with artificial ‘super intelligence’ sooner than we think, says top scientist

Posted in categories: robotics/AI, singularity

We could build an AI that demonstrates generalized, human-level intelligence within three to eight years — which may open the door to a “super intelligence” in a very short space of time.

Mar 5, 2024

AGI in 3 to 8 years

Posted in categories: cyborgs, economics, employment, internet, robotics/AI, singularity

When will AI match and surpass human capability? In short, when will we have AGI, or artificial general intelligence… the kind of intelligence that can teach itself and grow into a vastly larger intellect than any individual human?

According to Ben Goertzel, CEO of SingularityNET, that time is very close: only 3 to 8 years away. In this TechFirst, I chat with Ben as we approach the Beneficial AGI conference in Panama City, Panama.

Continue reading “AGI in 3 to 8 years” »

Mar 2, 2024

The Paradox Of Time That Scares Scientists

Posted in categories: cosmology, singularity

When time reaches its limits, scientists call those moments “singularities.” These can mark the start or end of time itself. The most famous singularity is the Big Bang, which happened around 13.7 billion years ago, kicking off the universe and time as we know it. If the universe ever stops expanding and starts collapsing, it could lead to a reverse of the Big Bang called the Big Crunch, where time would stop. As our distant descendants approach the end of time, they will face increasing challenges in a hostile universe, and their efforts will only accelerate the inevitable. We are not passive victims of time’s demise; we contribute to it. Through our existence, we convert energy into waste heat, contributing to the universe’s degeneration. Time must cease for us to continue living.

Page 4 of 87