And then, there’s always the weird and quite surreal end of humanity called ‘The Technological Singularity’. It’s a concept from the philosophy of computing, really. But we could be en route to making it real...


It should happen some time over the next thirty years. Suddenly, something very weird will happen. We will vanish into a maelstrom never experienced by any living creature before: the Technological Singularity. And the really bizarre thing is, we can’t tell you what it’s going to be like. That’s the problem with singularities: they are, by definition, out of reach of our imagination.

There are a few things we can be pretty sure of. Obviously, a thing called 'Technological Singularity' has something to do with technology. It should be dramatic, hefty and sudden. Oh, and you might be interested to hear it should wipe our kind off the face of the planet.

‘Singularity’ is a word from mathematics. Roughly speaking, it is a place where mathematical order becomes chaotic, where regularity becomes illogical and unpredictable. You cannot even properly call a singularity a ‘place’. It is just that: a singularity. For example, at the heart of a black hole lies a singularity: a point where time collapses and the laws of physics no longer apply.

In the 1950s, mathematician John von Neumann predicted that our society is destined to go Singularity, too. His line of reasoning was simple: we’re inventing things at an ever faster pace. Faster and faster it goes, until one day, the trend explodes. We will be inventing new things infinitely fast. We and our inventions will... ehm, go out of control.
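To see what that ‘explosion’ means, here is a rough mathematical sketch (our own illustration, not von Neumann’s actual calculation). Suppose the pace of invention grows with the square of everything already invented, because every new idea can be combined with every old one. The amount of invention $x(t)$ then obeys

\[ \frac{dx}{dt} = x^{2} \qquad\Longrightarrow\qquad x(t) = \frac{1}{t_{\ast} - t}, \]

and that is not just fast growth: $x(t)$ shoots off to infinity at a finite date $t_{\ast}$. That finite date is the mathematician’s singularity -- and it is exactly the kind of runaway Vinge describes below.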

Read how novelist and mathematician Vernor Vinge put it in a famous 1993 paper: "This change will be a throwing away of all the previous rules, perhaps in the blink of an eye, an exponential runaway beyond any hope of control." And: "We are on the edge of change comparable to the rise of human life on Earth."

If you think that sounds pretty vague -- we agree. Luckily, Vinge also helpfully pointed out a few ‘omens’ that should announce the Singularity. You can see some of them around already. As Singularity Day approaches, computers get ever faster, and huge computer networks arise. More and more tasks are automated. Information and ideas spread ever faster. Slowly, humans will begin to merge with their machines, finding ways to connect our minds to our computers directly. We might even genetically engineer ourselves, or find other ways to upgrade our intelligence.

And then? Next thing you know, some kind of superhuman consciousness awakens. Out of the computers, and out of us, ‘something’ arises. It is something superhuman; something superintelligent. Before you know it, it begins spewing out ideas and theories, infinitely fast. And it finds ways to make these ideas real at a similarly infinite speed.

Most likely, you're sucked into the process, in some unimaginable way. Around 2030, you're tied and bound to the computer, remember? You'll have an interface in your brain to control your computer, or your brain will simply be a computer. So probably, your mind suddenly goes Singularity, too.

And after that... Well, we can’t tell. Perhaps we will leave our bodies. Maybe we will merge with our computers, break down the barriers of our individuality and become a single train of thought. Maybe we will become a duck-shaped cloud of purple gas, who knows? Maybe we will overcome the restrictions of our dimensions, and become God. Now that would be pretty strange indeed, wouldn’t it?

To us living beings, the Singularity should be bad news. After all, if Von Neumann and Vinge are right, there won’t be any humans around anymore in 2030. A visitor would more likely find an empty planet: deserted buildings, empty streets. Perhaps there won't even be a Universe anymore. We might even have terminated the Universe while going Singularity.

But will it be good or bad, at least? Well -- guess what: we can’t tell. In a Singularity, ‘good’ and ‘bad’ will probably have lost their meaning. Our kind could end up in a state we today-lings would call ‘god-like’. Or, depending on your point of view, ‘insane’.


So... Can we avoid it?

Well, maybe we can. For one thing, there’s always the possibility that Vinge and Von Neumann have got it all wrong. Maybe no superhuman consciousness will arise out of our computers at all. This is how the great mathematician Roger Penrose saw it: in his view, machines simply cannot become conscious. Period.

What’s more, instead of accelerating, our technology might in fact be slowing down. In 2005, physicist Jonathan Huebner studied the rate of technological innovation by counting the number of patents and major inventions per world citizen. The outcome: the speed of innovation is declining! We’re simply running out of new ideas. We already made the major inventions, like the telephone and the airplane -- and now we’re just doing the small stuff, making minor adjustments to our big ideas.

Even if we’re heading for the Singularity, we may be able to isolate the danger. Or hand our computers a strict order: "Whatever you do, computer, don’t mess with humanity."

But actually, we at Exit Mundi aren’t too sure that will help. After all, an ordinary human being is no match for a superhuman intelligence. Most likely, it will drag our kind along, into the Singularity. Hey, it might even be fun! Think of it this way: at least we won’t be bothered by minor little discomforts like death, when Singularity Day arrives and we become 'Post-Human', or more of that vague stuff.

On the other hand, as you can read elsewhere on this site, there are some hurdles to clear first. For one thing, our technology could turn against us beforehand, turning us into the Borg, or ridding the planet of humans. Why, after all, would a superhuman intelligence take us into account any more than we humans feel sorry for a microbe?

Now that would be a bummer. Finally, we’re becoming God -- only to find ourselves killed by our machines, just moments before we succeed.

 


LINKS OUT:

Vernor Vinge, "The Coming Technological Singularity"

Answers.com: "The Technological Singularity"


PAPER LINKS:

Justin Mullins: "Whatever happened to machines that think?" In: New Scientist, April 2005

Ray Kurzweil, "When Humans Transcend Biology" (2005)

All texts Copyright © Exit Mundi / AW Bruna 2000-2007.
You're not allowed to copy, edit, publish, print or make public any material from this website without written permission by Exit Mundi.