A couple of years ago I stumbled on a really weird webpage. It was part of some kind of discussion forum, and the participants seemed to be talking about massive changes in the world. It truly had me mystified for a while, as if I had encountered the kind of shadowy, powerful organisation so popular in fiction.
The organisation isn't exactly shadowy, but it isn't particularly well-known either. It is the Singularity Institute. The best way to explain them is to quote from their homepage:
What is the Singularity? Sometime in the next few years or decades, humanity will become capable of surpassing the upper limit on intelligence that has held since the rise of the human species. We will become capable of technologically creating smarter-than-human intelligence, perhaps through enhancement of the human brain, direct links between computers and the brain, or Artificial Intelligence. This event is called the "Singularity" by analogy with the singularity at the center of a black hole - just as our current model of physics breaks down when it attempts to describe the center of a black hole, our model of the future breaks down once the future contains smarter-than-human minds. Since technology is the product of cognition, the Singularity is an effect that snowballs once it occurs - the first smart minds can create smarter minds, and smarter minds can produce still smarter minds.
To quote Eliezer S. Yudkowsky:
The Singularity holds out the possibility of winning the Grand Prize, the true Utopia, the best-of-all-possible-worlds - not just freedom from pain and stress or a sterile round of endless physical pleasures, but the prospect of endless growth for every human being - growth in mind, in intelligence, in strength of personality; life without bound, without end; experiencing everything we've dreamed of experiencing, becoming everything we've ever dreamed of being; not for a billion years, or ten-to-the-billionth years, but forever... or perhaps embarking together on some still greater adventure of which we cannot even conceive. That's the Apotheosis.
If any utopia, any destiny, any happy ending is possible for the human species, it lies in the Singularity.
There is no evil I have to accept because "there's nothing I can do about it". There is no abused child, no oppressed peasant, no starving beggar, no crack-addicted infant, no cancer patient, literally no one that I cannot look squarely in the eye. I'm working to save everybody, heal the planet, solve all the problems of the world.
So if the Singularity occurs in this way, this could be the last invention humanity has to make. What's more, many of the people involved in the Institute seem to be transhumanists, who can be defined as follows:
Transhumanism is a way of thinking about the future that is based on the premise that the human species in its current form does not represent the end of our development but rather a comparatively early phase. For example, we might be able to upload our minds into some kind of computer that allows us to have more neurons and be more intelligent. (Note that this computer could be installed inside our own bodies.)
Once we have improved ourselves, we might be so changed that we would be best considered 'posthumans'. To quote the World Transhumanist Association:
Posthumans could be completely synthetic artificial intelligences, or they could be enhanced uploads, or they could be the result of making many smaller but cumulatively profound augmentations to a biological human. The latter alternative would probably require either the redesign of the human organism using advanced nanotechnology or its radical enhancement using some combination of technologies such as genetic engineering, psychopharmacology, anti-aging therapies, neural interfaces, advanced information management tools, memory enhancing drugs, wearable computers, and cognitive techniques.
Well, that's a lot to think about. But if you want to look further, try some of the Shock Level 4 pages here, and here.
To get a feel for what future society might be like, have a look at the Orion's Arm SF setting here.
So the answer from the singularitarians and transhumanists is: "There may not be many humans around in 100 years' time - we will have moved on to being something better". Of course, they could be wrong.