The subject of this week’s cover story, Dr. Terry Grossman, believes the human race is fast approaching the singularity, the point at which technology becomes so powerful it allows us to live forever and changes the fabric of existence to such an extent that it’s hard for people today to imagine what the world will look like. As nutty as this theory may seem, it’s one that’s caught on with many science and technology experts, most prominently the famous inventor Ray Kurzweil, who co-authored Grossman’s book Fantastic Voyage: Live Long Enough to Live Forever.
The singularity sounds great, but there’s a problem, writes author Karl Schroeder in a new article in the online mag WorldChanging. Future technological progress will be seriously curtailed by our destruction of the ecosystem. We’re screwing the pooch environmentally, writes Schroeder, so much so that even the super-fantastic computers promised by the singularitarians won’t be able to repair all the damage, if they can be developed at all:
…this upward curve of technological development rides on something: it rides on the back of humanity, and we ride (largely for free, until now) on the back of the natural system that sustains us. Once serious environmental deterioration sets in, the curve of technological change will flatten, even if we develop 'godlike AIs,' for the simple reason that intelligence itself is not enough to sustain growth. You also need resources, externally-derived social stability, etc. Climate change threatens technological growth by threatening its fundamental drivers.
If you’re the type of techno-geek who lovingly compiles essays like this in your Battlestar Galactica Trapper Keeper, here are other heady singularity treatises to set your steampunked heart a-flutter: “One Half a Manifesto,” by virtual reality pioneer Jaron Lanier, dismantles the theory of singularity by arguing that while computer hardware may be advancing towards mind-boggling power, computer software is becoming so bloated and wasteful that worldwide technological progress will soon stagnate. “In 20 years, we're talking about a planet of help desks,” writes Lanier.
Want another delightfully dweeby take on the singularity? Check out “Why the Future Doesn’t Need Us,” by Sun Microsystems co-founder Bill Joy. Unlike Lanier, Joy believes in the coming singularity – and it scares the bejesus out of him. When computers become all-powerful, writes Joy, what’s to say they will help humankind, and not utterly destroy it? After all, we don’t have a good track record when it comes to respecting technology’s shortcomings: the scientists building the first atomic bomb theorized its detonation might set fire to the entire atmosphere – but that didn’t stop them from setting it off. And the coming technological threats could make nuclear weapons look like squirt guns, suggests Joy:
I think it is no exaggeration to say we are on the cusp of the further perfection of extreme evil, an evil whose possibility spreads well beyond that which weapons of mass destruction bequeathed to the nation-states, on to a surprising and terrible empowerment of extreme individuals.
Where’s Sarah Connor when we need her? – Joel Warner