The Singularity, also known as the nerd rapture, can be a bit of a mystery to some. Well, no longer: the Singularity Institute has published a Singularity FAQ (think of it as Singularity 101).
Is there anything skeptical here? Well yes, but I’ll come to that in a minute.
First, some basics (from the FAQ of course) …
What is the Singularity?
There are many types of mathematical and physical singularities, but in this FAQ we use the term ‘Singularity’ to refer to the technological singularity. There are three distinct ideas someone might have in mind when they refer to a ‘technological Singularity’:
- Intelligence explosion: When humanity builds machines with greater-than-human intelligence, they will also be better than we are at creating still smarter machines. Those improved machines will be even more capable of improving themselves or their successors. This is a positive feedback loop that could, before losing steam, produce a machine with vastly greater than human intelligence: machine superintelligence. Such a superintelligence would have enormous powers to make the future unlike anything that came before it.
- Event horizon: All social and technological progress thus far has come from human brains. When technology creates entirely new kinds of intelligence, this will cause the future to be stranger than we can imagine. So there is an ‘event horizon’ in the future beyond which our ability to predict the future rapidly breaks down.
- Accelerating change: Technological progress is faster today than it was a century ago, and it was faster a century ago than it was 500 years ago. Technological progress feeds on itself, leading to accelerating change much faster than the linear change we commonly expect, and perhaps change that is faster than we can cope with.
These three ideas are distinct, and might support or contradict each other depending on how they are stated. In this FAQ we focus on the intelligence explosion Singularity, which allows for easy discussion of the other two Singularity ideas.
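To make the "intelligence explosion" idea above a little more concrete, here is a toy sketch of my own (not something from the FAQ, and the numbers are invented): each generation of machine designs a slightly better successor, and the size of each improvement depends on how capable the designer already is. Whether the loop "loses steam" or runs away depends entirely on the assumed returns.

```python
# Toy model of a recursive self-improvement loop (purely illustrative).
# 'capability' is an abstract score; 'returns' says how much a designer of a
# given capability can improve its successor. These parameters are assumptions
# for illustration, not claims made by the FAQ.

def run_loop(capability, generations, returns):
    """Iterate: each generation's improvement scales with current capability."""
    history = [capability]
    for _ in range(generations):
        gain = returns * capability      # improvement grows with capability
        capability = capability + gain   # the successor is that much better
        history.append(capability)
    return history

# The same loop either creeps upward or explodes, depending on the assumed returns.
print(run_loop(1.0, 10, returns=0.05))  # modest: roughly 1.6x after 10 generations
print(run_loop(1.0, 10, returns=0.5))   # explosive: roughly 57x after 10 generations
```

The whole argument, of course, hinges on what the real "returns" parameter looks like, which is exactly what nobody knows.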
OK, now on to the bits of skeptical interest. While at TAM last year, I discovered (much to my surprise) that Michael Vassar was also there, not as a speaker, but just to hang out with other skeptics. He is the Singularity Institute's president and is also responsible for organizing the Singularity Summit.
As for the Singularity Summit, at the most recent meeting in August 2010 James Randi was one of the speakers; he lectured to a packed room on the practical need to rely on your own intelligence and critical thinking abilities in order to make sense of expert consensus and build a realistic understanding of the world.
Other well-known skeptics such as DJ Grothe also hung out there.
Ah, so that’s OK then: it has received the blessing of well-known skeptics, so we can deem it to be on the “approved” list … er, no, and this is a key point. Never accept anything simply because somebody else has deemed it to be OK. Instead, be an independent thinker and form your own views and opinions.
All in all, there are some very interesting ideas floating about, but is there data to support them? Can you really extrapolate current trends forward as Kurzweil does? Some well-known skeptics don’t think so, and I must admit they have a point; yet I confess to being sympathetic to the ideas myself, because they are indeed so compelling.
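To see why the extrapolation question matters so much, here is a back-of-the-envelope sketch of my own (with made-up numbers, nothing from Kurzweil or the FAQ): a straight-line forecast and an exponential forecast can both look reasonable over past data, yet diverge enormously once you project decades ahead.

```python
# Hypothetical trend: some capability doubling every 2 years over a decade.
# The data are invented purely for illustration.
years = [0, 2, 4, 6, 8, 10]
values = [1, 2, 4, 8, 16, 32]

# Linear extrapolation: assume the most recent slope simply continues.
slope = (values[-1] - values[-2]) / (years[-1] - years[-2])
linear_forecast = values[-1] + slope * (30 - years[-1])

# Exponential extrapolation: assume the observed doubling time continues.
doubling_time = 2.0
exponential_forecast = values[-1] * 2 ** ((30 - years[-1]) / doubling_time)

print(f"Year 30, linear forecast:      {linear_forecast:.0f}")      # 192
print(f"Year 30, exponential forecast: {exponential_forecast:.0f}") # 32768
```

Both curves describe the past decade equally well here; the disagreement only appears in the forecast, which is precisely where the skeptical question bites.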
As for some parts of the new FAQ … hmmm … well, it feels like science extrapolated out beyond the boundary into science fiction. But then, we do move forward by stretching our imagination into new territories and then reaching out to make it reality, so why not dream dreams.