Some Pages

Friday, March 12, 2010

Misconceptions on Singularity

Several weeks ago (yes, I’ve been procrastinating), I attended a philosophy dialogue on the topic of the technological singularity. The singularity is a rather fun concept for me, especially when combined with my love of transhumanism. I was first introduced to the concept through a webcomic, namely Dresden Codak (the story arc concerning the singularity can be found here), which is ironic given that much of what I am going to say contradicts what the comic says. (Also, intelligent as Kimiko is supposed to be in the comic, she apparently slept through the evolution part of her biology class.)

So, what exactly is the technological singularity? The phrase comes from the physics term “gravitational singularity”, which refers to the point where the measurements of a gravitational field become infinite and our conventional understanding of it breaks down. The technological singularity is the point where our scientific advances, having continued to accelerate, reach a rate of discovery that makes it impossible for us to predict the future based on our conventional understanding of the world. The idea traces back to the British mathematician I.J. Good, who proposed that if we were to create an intelligent machine, that machine could create a better version of itself, which could create a better version of itself, and so on, until we had machines of immense intelligence. The main evidence people point to is that new technologies are being discovered at an increasingly fast pace, on a seemingly exponential curve compared to the past. Just think of how quickly the fancy new PC you bought is replaced by an even fancier, newer one nowadays. According to most singularity theorists, the Singularity lies at the point where the curve has become nearly vertical and the rate of technological discovery is almost instantaneous. Speculation on what a post-singularity society could achieve often sounds like a description of gods: complete control over the weather, eradication of disease, immortality, transcending humanity (or humanity becoming obsolete, in the less optimistic predictions), and so on. The abilities of a post-singularity society are apparently limitless.
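The exponential-growth argument is easy to make concrete with a quick sketch. Purely as an illustration (the fixed two-year doubling time and the 1971 Intel 4004 baseline are my assumptions, not anyone's forecast), here is what a constant doubling rate does to a transistor count:

```python
# Illustrative Moore's-law-style extrapolation. Assumptions (mine, for
# illustration only): a constant two-year doubling time, starting from the
# roughly 2,300 transistors of the 1971 Intel 4004.

def transistors(year, base_year=1971, base_count=2300, doubling_years=2.0):
    """Extrapolated transistor count under a constant doubling time."""
    return base_count * 2 ** ((year - base_year) / doubling_years)

for year in (1971, 1991, 2011, 2031):
    print(year, f"{transistors(year):.3g}")
```

The point is that each fixed step forward multiplies the total rather than adding to it, so projections quickly dwarf everything that came before; singularity theorists read the near-vertical tail of such a curve as the point where prediction breaks down.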

Yet there are some problems with that concept of the singularity. Science, for one thing. An increasing number of people claim that a technological singularity, in which we will ascend beyond our feeble mortal forms and become godlike beings, is inevitable, yet they ignore some important facts. There is an upper limit, with our current technologies, on computing power: past a certain point a computer cannot dissipate heat fast enough and literally melts. Energy requirements are another issue; there isn’t enough energy in the world to power the kinds of things these singularity theorists describe, especially if we were to rely on machines, as most theories do. Electrical power is not the most efficient source of energy, especially when compared to far more efficient organic systems. We’re not that far from the upper limit on how much energy we can store in a battery, and that limit is certainly not enough to power a singularity. And this isn’t even going into how many of our current energy sources are non-renewable. Now, yes, all of this can supposedly be hand-waved away by saying we will create the technologies to get over those barriers, and that it’s impossible to predict anything post-singularity using our current understanding of the universe, but therein lies the problem: it’s not science. One of the most important properties of a scientific hypothesis or theory is that it is falsifiable. The claim that any criticism of the singularity is irrelevant because we cannot predict past the singularity is just a way to deflect all opposition. You can point to as many things as you like that supposedly support your hypothesis, but unless the hypothesis can be tested, and can be falsified, it is not a scientific hypothesis. That move turns the singularity into something more like a religion than a science; yet it continues to be put forward as science by huge numbers of people.
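The heat argument has a hard physical floor behind it. Landauer's principle says that erasing one bit of information must dissipate at least kT·ln 2 of heat. A back-of-envelope sketch (my own illustration; the 10^27 bits-per-second figure is a made-up hypothetical, not a claim about any real machine):

```python
import math

# Landauer limit: minimum heat dissipated per bit erased is k * T * ln(2).
BOLTZMANN = 1.380649e-23  # J/K (exact, by the 2019 SI definition)

def landauer_joules_per_bit(temperature_kelvin=300.0):
    """Thermodynamic lower bound on energy dissipated per erased bit."""
    return BOLTZMANN * temperature_kelvin * math.log(2)

# At room temperature (~300 K), each erased bit costs roughly 2.9e-21 J.
per_bit = landauer_joules_per_bit()

# A hypothetical machine erasing 1e27 bits every second would dissipate
# megawatts of heat even at this theoretical minimum:
watts = per_bit * 1e27
```

Real circuits operate many orders of magnitude above this floor, which is why heat removal, rather than raw switching speed, ends up being the binding constraint described above.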

Another, somewhat minor, point that bothers me is the constant claim that the singularity is inevitable. This, to me, is what gives belief in this singularity theory its strongest resemblance to a religion; many of its supporters claim that the singularity is going to come no matter what we do. This annoys me to no end. At least in religious apocalyptic scenarios there is a reason behind the belief in inevitability: some higher entity beyond human control is directing the events that lead up to the apocalypse. Yet the singularity can only be brought about through human actions; it makes no sense at all that it would be inevitable. Say one day Russia gets bored and decides to nuke the entire world for the lulz. Not going to get your singularity then, huh? Or an asteroid collides with the planet. Or, hell, the entire human race decides eating cookies is a better way to spend its time than sitting at a computer, and the computer market collapses overnight. Belief in the inevitability of the singularity is a contradiction in itself: its adherents are making a definite prediction about the future, and yet, by their own account, you cannot make predictions about the singularity. There is also this strange, rock-solid belief among so many of them that the singularity will certainly occur during their lifetimes, which I just cannot fathom.

Now, this may sound strange after all these critiques, but I believe in the singularity. Just not that singularity. This is a strange case where the more extreme view is the one with the most adherents, and the more reasonable theories are in the minority. The problem with the singularity is that many people have taken the same evidence (the increasing rate of technological discoveries, fancy new AI inventions, etc.), come to similar but still slightly different conclusions, and all given their theories the name “singularity”. I believe in (special thanks to TVtropes for giving me this term) a “soft singularity”. This is a more generalized sense of singularity, in which rapid technological discoveries radically alter society in ways that we could not have predicted. The invention of the printing press could be seen as one such singularity; it created an intelligence explosion and led to changes that no one before its invention could have foreseen. I’ll steal another quote from TVtropes here: “As I see it, the main problem in designing a plausible 23rd century these days isn't lack of grandeur, it's the imminence of changes so fundamental and unpredictable they're likely to make the dramas of 2298 as unintelligible to us as the Microsoft Anti-Trust Suit would be to Joan of Arc.” Yes, we will discover new technologies in the future, many of which will probably do things that seem impossible today. Yes, our way of life will be changed in unpredictable ways. Will we become gods? Sadly, much as I want to be one, that part is unlikely, at least in our lifetimes. I do not claim outright that such a singularity is impossible; yes, we could indeed make the discoveries that let us overcome the obstacles facing it. On the other hand, we may not; getting around them might turn out to be impossible.
It’s good to work toward new discoveries, and toward transcending our current human limitations, but one must keep in mind that the singularity is not some inevitable, miraculous event that will solve all our problems. It’s still our job to solve our own problems, singularity or not.

1 comment:

  1. "This is a strange case, where the more extreme view is the one with the most adherents, and the more reasonable theories are in the minority."

    Doesn't sound that strange to me...

    A good read, wish I could have been at the lecture myself. Our lectures are on boring stuff that I can figure out on my own anyways, like "Is the Obama administration's foreign policy successful?" And as a fellow Transhumanist, I'm definitely in support of super awesome technologies to make me a death dealing cyborg ninja were-shark. Definitely agree though, guessing anything about the point where we can stop giving guesses as part of its own definition is kind of silly. Technological progress is technological progress. Something could happen tomorrow that radically changes the way we understand the world. Facebook, iPhones (and the rest of those accursed swiss-army-cellphones) already dramatically change things on a micro scale. Who's to say when the next big thing does happen? Or if it happens at all?