What ever happened with the singularity?
The notion of the singularity seems to have faded from public consciousness (not that it was ever a mainstream concept), for no better reason than that it has become unfashionable. Many articles have been written dismissing the possibility, but none seem to offer credible reasons why an AGI-mediated intelligence explosion is impossible; often we are simply expected to take the author's argument from incredulity as authoritative. But I do think there is one solid argument one can offer based on observational evidence.
And that observation is the Fermi paradox. Where IS everybody? We appear not to live in a universe where signs of large-scale intelligent activity are visible, and the Fermi paradox asks why, and where all the other galactic civilizations are, given that any single intelligent species could cover the entire galaxy in a scant million years even with technology close to ours, even by passively migrating between stars as they pass near one another. And given a million-year head start, their technology would have to be that much more capable, right?
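The "scant million years" claim is easy to sanity-check with back-of-the-envelope arithmetic. A minimal sketch, using round numbers of my own choosing (a galactic disc roughly 100,000 light-years across, and an effective colonization-wave speed expressed as a fraction of the speed of light; neither figure is from any particular study):

```python
# Back-of-the-envelope galactic colonization timescale.
# Assumed (hypothetical, round) figure: the Milky Way's disc is
# about 100,000 light-years across.
GALAXY_DIAMETER_LY = 100_000

def crossing_time_years(wave_speed_c: float) -> float:
    """Years for a colonization wave to cross the galaxy.

    wave_speed_c is the wave's effective speed as a fraction of c;
    since 1 light-year per year equals c, the division below yields
    years directly.
    """
    return GALAXY_DIAMETER_LY / wave_speed_c

# At an effective 10% of c, the crossing takes about a million years;
# even at a leisurely 1% of c it is only ten million, a blink on
# galactic or evolutionary timescales.
print(crossing_time_years(0.10))  # 1,000,000 years
print(crossing_time_years(0.01))  # 10,000,000 years
```

Even generous allowances for stops to build infrastructure only multiply these figures by a small factor; the conclusion that the galaxy is crossable in evolutionary time is robust to the assumptions.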
In this context we usually invoke the so-called Great Filter: reasons why a technological civilization never gets beyond our current single-planet state. Naively, an AGI takeover could be considered such a filter, but a moment of reflection shows that this is not the case. A post-Skynet Earth, or the machine takeover from The Matrix, is no less a technological civilization than it was when the humans were in charge (apologies for using popular movie examples). In fact it is easy to argue that it is more so: a machine civilization could expand without regard for biological weaknesses, requiring neither biospheres nor delicate adjustments to living conditions. The second part of a singularity, the intelligence explosion, makes expansion even more probable. Given an inevitable Singularity, a civilization a mere 100 years more advanced than us would have to be just as advanced as a million-year-old one, because the technological leap takes place at an accelerated pace during the intelligence-explosion phase. When technology goes exponential, 100 years may as well be a million.
Part of the assumption behind the Fermi paradox is that technological civilizations, acting on geological timescales, can occupy an entire galaxy. The singularity hypothesis, on the other hand, predicts that any technological civilization changes state in a short period, soon after inventing artificial intelligence. A singularity implies a technological-progress asymptote: a ten-year-old post-singularity civilization and a million-year-old post-singularity civilization are technologically equivalent, because both have reached the asymptote.
If such a phase exists, a singularity is no Great Filter at all; in fact it is the very opposite. And if a technological civilization inevitably heads towards a singularity, we should see signs of post-singularity civilizations lighting up the sky: a biological polity may hesitate to dismantle its solar system to build a Dyson sphere, but a computronium-based singleton or society of minds would have no such hesitation.
Perhaps another term should be added to the Drake equation: the fraction of technological societies that undergo a singularity. But wouldn't that be all of them that do not blow themselves up in some equivalent of our 1960s Cold War? Isn't the singularity inevitable once you engage in computer development? That, after all, is the assumption behind the notion of the impending singularity.
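Spelled out against the standard form of the Drake equation, the proposed extra factor (call it f_s; the symbol is my own shorthand, not an established one) would slot in alongside the existing fractions:

```latex
N = R_{*} \cdot f_p \cdot n_e \cdot f_l \cdot f_i \cdot f_c \cdot f_s \cdot L
```

where f_s is the fraction of technological societies that undergo a singularity and the other factors keep their usual meanings (star-formation rate, fraction of stars with planets, and so on). Note that if the singularity really is inevitable for any computing civilization, then f_s is effectively 1 and the new term changes nothing, which is exactly the tension raised above.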
So, perhaps that notion is wrong. There is no singularity, because it would make the Fermi paradox even more paradoxical.
Or… perhaps dreams of galactic migration and Dyson swarms are just like ants dreaming of giant anthills, with absolutely no relevance to a properly advanced civilization. The singularity would be a true change of phase, and the old rules would not apply.
A very interesting paper explores precisely this scenario, with the hypothesis that advanced computer-based civilizations may be waiting for a cooler future universe where computation is more efficient:
That is not dead which can eternal lie: the aestivation hypothesis for resolving Fermi's paradox (read it!). While the paper's reasoning is intriguing, I frankly don't understand why the civilizations it proposes would not also be active during the hot phase of the universe, even if it is not ideal for them. Surely securing their neighborhood before going into long-term aestivation would make sense? But apparently the premise is that, for these civilizations, information and computation matter more than energy.
In any case, the paper makes the point that a post-singularity civilization, whatever its form, would not have the same goals as a normal civilization as we understand it. So we can either assume that:
- A singularity does not make galactic colonization more probable.
- A singularity does not happen, because it is impossible.
- Singularities happen, and the end result is uniformly invisible to conventional searches because their technology is not visible at large scales. Does this tell us anything about the kind of technology a civilization beyond ours would employ?
On this last point, the interesting implication is that if the singularity exists, the Fermi paradox seems to imply that it converges on the same end state, whatever that may be…
One such convergence is described in the short story The Demiurge's Older Brother by Scott Alexander (also recommended reading). Briefly, Alexander suggests that newly arising superintelligences will spontaneously restrain themselves from excessive and overt expansionism, to avoid zero-sum destructive conflict with other potential superintelligences. This predicts that all existing singularities will arrive at a similar conclusion regarding expansionism and resource consumption.
Or, maybe technological singularities just aren’t a thing.
Frank J. Tipler arrived at the conclusion that there is no one else out there through similar reasoning.