Last week I had an e-mail discussion with blogger
Chell about transhumanism, which I thought was interesting enough to adapt into a posting. Her comments are in italics, mine are in regular print.
Isn't human intelligence just what it is? It's like kids going to school. They gain an education, but it does not increase their intelligence. Neither does each passing year. A person's intelligence, regardless of education or life experiences, is at a pretty set level, is it not?
Human intelligence has been creeping upward for a long time -- I think the average IQ in the US has risen 18 points since 1950 or something like that. It's the result of things like improved health and nutrition: healthier mothers having healthier babies with healthier brains, which is cumulative from generation to generation, the way each generation of children is growing up a little taller than their parents. What I was referring to for the future is more fundamental. Based on the current rates of improvement of our scanning technology and computer processing power, the functioning of the brain should be thoroughly understood within about 30 years at the latest, and we'll be able to start integrating nanocircuitry into it to enhance its abilities (we're already enhancing the brain with artificial processors in very crude ways, as with the implants for Parkinson's disease). By around 2045, machine intelligence (that which we now think of as "computers", though it will be far more sophisticated by then) will be fully integrated into the human mind. Imagine being able to access all the information on the internet in exactly the same way that you can recall facts or images in your own memory, being able to "think" with the same speed and precision with which computers can process information, etc., while still having your normal human abilities and personality. Organic brains are very good at certain things, but they do have their limitations. We need to overcome those in order to reach our full potential. As Vernor Vinge (one of the first proponents of the concept) said, we today can no more imagine the cultural and technological flowering that will follow the Singularity than a flatworm can imagine an opera.
Have been thinking about how you said each generation is more intelligent than the last, like each is taller. I can only wonder how many new generations there would be, if people were able to live forever. We couldn't go on endlessly multiplying, packing ourselves on this planet (or even others we might make habitable) like sardines. So there would eventually be no new, more intelligent generations.
On this I'll fall back on Aubrey de Grey's formulation: certainly a cure for aging would create problems, but it would not create any problems as bad as forty million people a year dying of a
ghastly, wasting disease, which is the current situation. Birth rates are declining all over the world anyway; they're below replacement level in most advanced countries. Once molecular manufacturing becomes widespread in another two decades or so, we'll see an explosion of material wealth which will dwarf that produced by the industrial revolution. We'll be able to accommodate any population growth resulting from the end of aging very easily, while eliminating most of the damage our industries now do to the environment.
Before we have the technology to do real machine-mind integration, we'll have widespread full-immersion virtual reality (that is, computer-generated environments indistinguishable from reality to all the senses), and I think that after the Singularity, most human activity will migrate into virtual reality quite rapidly. Billions of people will be able to live like Louis XIV if they feel like it, without occupying any physical space in the "real world" or using any resources except computer processing power. So I'm not worried about overpopulation. As for continuing to produce new generations, that will be a matter of individual choice, as it is now.
Sure, we could add computers to our brains (those of us who would be willing, anyway). We make artificial hearts and limbs for people who need them now. In my opinion, though, there's a difference between heart failure or the loss of a limb and a normally functioning human brain. Who or what would make these computers/chips/wires that would "enhance" our brains? If humans made them, there would be human error and glitches. Would you like to be happily zipping around in a Jetsons-style saucer, and all of a sudden have your brain freeze except for a box with "continue" and "cancel" buttons, both of which do nothing? And we still wouldn't be more intelligent. We'd just have an artificial something placed in our heads.
Adding the capabilities of computers (much more advanced than present-day computers, remember) to our existing human intelligence would greatly increase our intelligence by any reasonable definition, it seems to me. There's very little anatomical difference between a human brain and a chimpanzee brain except that the human brain is about three times bigger -- it has more neurons, more synapses, more processing power. That's what accounts for the fact that humans are more intelligent than chimpanzees. Mind-machine integration would increase the processing power of the mind far more -- not just three times, but ultimately by however much any given individual wanted for any given purpose.
We're already using machines to enhance our thinking -- consider how much faster and more accurately we can get information on the internet or do complex calculations on a computer than we could if we relied purely on our own brains. The problem is that we're still stuck with accessing that computer intelligence through slow, clumsy interfaces of keyboards and monitors, instead of directly. That's what will change.
By the way, I'm not primarily talking about surgically implanting electronics (more likely nanocircuitry, by then) in people's heads, though I suppose that might happen as an intermediate stage. Machine-mind integration will ultimately take the form of mind uploading. If you haven't read it already,
this (rather long) posting explains that.
As for reliability, the question to me is not whether using nanocircuitry as the physical substrate for a human mind would be absolutely reliable, but whether it would be
more reliable than the organic substrate (the brain) that the mind is dependent on right now. The organic brain has many reliability problems -- it's made of fragile material, it's ill-designed for backing up information so it can be restored if lost, its efficiency depends on a constant supply of various nutrients, something as trivial as five minutes' oxygen deprivation can cause an irreversible total system crash, etc. Computer systems with extremely important functions can be made very reliable. Systems which had our own selves "running" on them would be made the most reliable of all -- certainly far more reliable than the brains we have now.
Maybe most of the world will be like that someday. To me, it sounds like the most frightening situation imaginable. It sounds like people would attempt to cast aside their humanity to become robots. I would rather not be here for that.
I've never understood this reaction, unless it's the cumulative product of endless unimaginative treatment of such themes in science fiction. There's no reason why increasing human intelligence would make us less human -- less emotionally sensitive, humane, or whatever other attribute one might consider to be part of that "humanity" to be "cast aside". Just the opposite. I think our emotional and aesthetic lives are much richer and subtler than those of chimpanzees,
because of our increased intelligence. I don't think we've lost anything by becoming three times as intelligent as chimpanzees. That's how we
gained our humanity. Becoming even more intelligent will enhance our humanity, not cause us to lose it.
The only thing we stand to "lose" by achieving the Singularity is our limitations -- being limited to a certain maximum life span or a certain maximum range of mental capabilities allocated by the blind processes of evolution. The idea of defining the essence of "humanity" by its limitations seems awfully -- well -- limiting.
Paganism is such a blanket term. All sorts of Pagans have all sorts of views on life and an afterlife, and on immortality. I believe we are all immortal already. Even looking at it in the most basic way, when we die, our bodies decompose. But we don't just disappear. Our bodies merge back with the elements of Earth. We continue in this way, at least. If I believed that's all there was, that would be fine. But I believe something must also become of our spirits. That energy must go somewhere. I do believe in reincarnation, but I also think we don't all come back to the world we can most easily see right now. As for man-made immortality, I think that's up to each individual, whether he wants to attempt that or to live that way. The progress of science shouldn't be held back, since facts are not "bad" things.
Well, of course, I don't believe I
have a "spirit", so that's a moot point. The physical material my body is composed of is not the essence of me either; every molecule that was in my body 20 years ago has been replaced by now, but I'm still the same person. So if I die, I don't much care what happens to the organic material -- I'm still dead, regardless. As for reincarnation, if I die and another entity is born somewhere which is supposed to be a continuation of me, but that entity doesn't
remember being me, then it
isn't me. I'm still dead.
I do find it curious that many people who say they would not want to live forever find the idea of an afterlife attractive. They think they would be bored in an ever-changing, ever-advancing civilization whose culture we today (as I said) can no more imagine than a flatworm could imagine an opera, but they would be just fine "living" forever in a static Heaven.
I think deep down almost everyone actually wants to survive. Look how eagerly the mass public mind laps up stories like Bridey Murphy which supposedly provide evidence for reincarnation, even though reincarnation is a heresy from a Christian viewpoint. They want to believe existence can go on. It's just that the idea of achieving this through technological means in the real world is so new and unfamiliar.
There may well be people who want nothing to do with life extension or virtual reality or intelligence enhancement or whatever -- though I don't think there will be many, once it becomes clearer what it all actually means. We have people like the Amish right now. Nobody bothers them, but there aren't many of them, and they win few recruits from the mainstream society.
Evolution has produced many remarkable things, but it is a very cruel process. In a sense, what humans are starting to do is to
take charge of that process, to start directing it consciously rather than merely being the victims of its blind operations. I hope you can see it that way.
Through evolution the mindless universe has finally produced something greater than itself: the mind. Now the mind will take charge, and ultimately spread itself throughout the universe, remaking the universe as it sees fit.
Labels: Freedom, Technology