
Why the Singularity Might Not Happen: Predictive Apocalypse Analytics

The Singularity is the point at which computer intelligence exceeds human intelligence, and the future course of humanity becomes more difficult to predict. Photo: Daniele Zedda/Flickr/CC-BY-2

Singularity or Apocalypse?

We’ve mentioned the concept of the Singularity a few times in these pages, but it’s not a foregone conclusion that this is the destiny of humanity. The NY Post ran a sensationally titled review last week of Eric H. Cline’s book on the near-simultaneous collapse of the Bronze Age civilizations, including ancient Egypt. (We can’t quite resist including this work of art of an article plug line: “Ancient civilizations fell almost simultaneously & it could happen again.”)

Although Cline’s book (and the NY Post review of it) are interesting and thought-provoking, there are better, more analytical ways to think about the repeated sudden collapse of past civilizations.

Required reading for mathematical archaeologists … and predictive modelers

We have two recommendations.

The first is physicist Geoffrey West’s classic TED talk, “The surprising math of cities and corporations.”

Although Prof. West explains his observations on scale simply and straightforwardly in a short ten-minute talk, his insights have profound implications for topics as varied as:

  1. Neuron speed in elephants vs. humans (why the ratio of brain size to a species’ average body mass, rather than raw brain size, better approximates a species’ intelligence).
  2. Determining the right police department staffing for a given city. This might be useful in the long-term GDP model we discussed in our earlier post based on something like the Social Progress Index, where investment in security services like police is one input.
  3. Mergers and IPO theory: determining when a corporation has grown too large.
  4. And, yes, why apocalypses happen and vast civilizations suddenly vanish, expressed in a concise, mathematical way.
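The scaling behavior behind these observations can be sketched as a simple power law, Y = Y0 * N^beta, where N is a city's population. The exponents below (roughly 0.85 for infrastructure, roughly 1.15 for socioeconomic outputs) are the figures commonly cited from West's work; the constants and numbers here are purely illustrative, not a reproduction of his data.

```python
# Illustrative power-law scaling in the spirit of West's talk: Y = Y0 * N**beta.
# beta ~ 0.85 (sublinear): infrastructure (roads, gas stations, wiring)
# beta ~ 1.15 (superlinear): socioeconomic outputs (wages, patents, also crime)

def scaled_quantity(population: float, beta: float, y0: float = 1.0) -> float:
    """A quantity that scales as a power law in city population."""
    return y0 * population ** beta

# What happens when a city doubles in population?
small, big = 1_000_000, 2_000_000
infra_ratio = scaled_quantity(big, 0.85) / scaled_quantity(small, 0.85)
socio_ratio = scaled_quantity(big, 1.15) / scaled_quantity(small, 1.15)

print(f"Infrastructure grows by {infra_ratio:.2f}x (economies of scale)")
print(f"Socioeconomic output grows by {socio_ratio:.2f}x (increasing returns)")
# 2**0.85 ~ 1.80: roughly 15% infrastructure savings per doubling.
# 2**1.15 ~ 2.22: roughly 15% more output per capita per doubling.
```

The same sublinear/superlinear split is what makes cities so robust and, in West's telling, what forces ever-faster cycles of innovation to sustain superlinear growth.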

Here is his acclaimed TED talk:

And our other recommendation is Joseph Tainter’s tour de force, The Collapse of Complex Societies. Tainter practically invented the field of mathematical archaeology, and he provides a simple but persuasive mathematical model that unifies the collapse of many ancient civilizations. As these societies grew, the classic economics of diminishing returns meant they gradually became less efficient, until they suddenly collapsed from exhausted resources. (Tainter discusses climate change and other factors as well, but he argues that the sole purpose of government is to serve as an insurance policy against systemic risks that private insurance cannot handle, and that a functioning government should therefore have been able to weather crop failures; managing these kinds of systemic risks was its reason for being.)
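Tainter's core mechanism can be sketched with a toy model (our illustration, not his actual formulation): the benefits of added social complexity are concave, while the costs of maintaining that complexity grow roughly linearly, so the marginal net return eventually turns negative.

```python
import math

# Toy diminishing-returns model in the spirit of Tainter's argument.
# The specific functions are arbitrary placeholders chosen for clarity.

def benefit(c: float) -> float:
    return 10 * math.log(1 + c)   # concave returns to complexity

def cost(c: float) -> float:
    return c                      # roughly linear maintenance cost

def marginal_net(c: float, dc: float = 1e-6) -> float:
    """Numerical derivative of net benefit (benefit - cost) at complexity c."""
    net = lambda x: benefit(x) - cost(x)
    return (net(c + dc) - net(c)) / dc

for c in [1, 5, 9, 15, 30]:
    print(f"complexity={c:>2}: marginal net return = {marginal_net(c):+.3f}")
# Here benefit'(c) = 10/(1+c) and cost'(c) = 1, so the marginal net return
# crosses zero at c = 9 and is negative beyond it: past that point, every
# additional layer of complexity makes the society worse off.
```

A society stuck past the zero-crossing keeps paying for complexity it can no longer afford, which is exactly the fragile state in which a shock like a crop failure can trigger sudden collapse.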

Tainter is a bit of a darling of some libertarian thinkers, although we understand that Tainter himself is not a libertarian. Once you start talking about the collapse of civilizations, you can see why this might be: if government is inherently unstable, then you want to minimize reliance on it in favor of individualism, privatization, or more regional/federated governance.

And, indeed, the Cline review mentions greater privatization in the ancient world as a consequence of collapse (and, in the opposite direction, an ancient trade embargo as an early form of collective security, an alliance’s alternative to military action).

NGOs and private enterprise increase social stability?

Not much mentioned (to our knowledge) is the history of the rise of early NGOs in Europe during the repeated waves of governmental collapse that followed the fall of the Roman Empire. Renaissance Europe, of course, also featured some of the first multinational private enterprises and early corporations, and it seems quite possible that Europe’s experience with governmental and regional fractiousness helped to motivate these developments.

Concepts like diminishing economic returns are an important part of analytics, including the analyses we discussed in our Social Progress Index article. But we do not see Tainter’s formulas as the knee-jerk ideological excuse, which they are often used as, for reducing all government expenditures to the benefit of some special interests. (Indeed, as our previous article makes clear, even some prominent conservative economists feel the data suggest we are not making enough social investments, a viewpoint opposite to their party line.)

Analytics is about optimizing results: finding the right level of corporate spending, or, in the case of a nation, finding the level of social and infrastructure investment that optimizes a suitable metric such as long-term GDP (see our article).
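"Finding the optimal spend" is, at its simplest, a one-dimensional maximization problem. Here is a minimal sketch, assuming a made-up concave objective standing in for a long-term growth metric; a real analysis would replace `long_term_growth` with an estimated model.

```python
# Minimal sketch of finding an optimal spend level. The objective below is a
# hypothetical placeholder (diminishing returns on investment minus its linear
# cost), not an estimated economic model.

def long_term_growth(spend: float) -> float:
    return 4 * spend ** 0.5 - spend

def best_spend(lo: float, hi: float, steps: int = 10_000) -> float:
    """Brute-force grid search for the spend level that maximizes the metric."""
    grid = [lo + (hi - lo) * i / steps for i in range(steps + 1)]
    return max(grid, key=long_term_growth)

opt = best_spend(0.0, 10.0)
print(f"optimal spend ~ {opt:.2f}, growth metric = {long_term_growth(opt):.2f}")
# Analytically: d/ds (4*s**0.5 - s) = 2/sqrt(s) - 1 = 0  =>  s = 4.
```

The point of the toy example is the shape of the answer: the optimum is an interior point, neither zero spending nor unbounded spending, which is the substantive claim of the surrounding paragraphs.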

Although ideologues may triumph when it comes to fundraising or garnering media sound bites, they are seldom correct. The optimal answer is rarely to abolish an entire function of government or, alternatively, to increase spending far above historical norms. Circumstances constantly change, and any given nation is likely to be above the optimal spend on some social investments and below it in other areas of endeavor.

The goal of analytics is to achieve consensus among stakeholders through plausible modeling supported by sound science and data. When this is successfully achieved, profound efficiencies result through the elimination of political infighting and optimal use of resources. This sometimes happens on the grand scale of billions or trillions in savings to a large corporation or nation.

Tainter, writing in the pre-Internet era, believed his data showed that technological progress was slowing down. He believed this was a result of the same diminishing-returns phenomenon that affected all other areas of human endeavor.

Technology actually accelerating, not slowing

This he got wrong, at least in the near term. In fact, technological progress in computers, and in every area of technology that benefits from computers (i.e., nearly everything), is accelerating exponentially. The reason is simple: each generation of computers is used to design its successor, which thus benefits from the technological gains of the prior generation. As a result, each generation of computers is a multiple of the power of the previous generation, producing an exponential acceleration of technology. Futurists like Ray Kurzweil, whom we have often mentioned in these blog pages, believe this will lead to the Singularity: the point at which computer intelligence surpasses human intelligence, and the future course of humanity becomes difficult to predict as a result. (It is not clear whether SkyNet takes over, we become bionic cyborgs, or something else entirely happens.)

The only remaining question is whether there are fundamental limits on this exponential growth. For many decades, it was widely believed that physical limits would one day halt Moore’s law (the formula, predicting a periodic doubling of computer power every few years, that has guided Silicon Valley ever since it was first formulated decades ago). It is now thought that these physical limits are several decades away, so it may still be possible to achieve the Singularity.
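Moore's-law-style growth is just compound doubling. A quick sketch, using the commonly quoted two-year doubling period (treat the numbers as illustrative, not a forecast):

```python
# Moore's-law-style growth: power doubles roughly every `period` years.
# The 2-year doubling period is the commonly quoted figure.

def doublings(years: float, period: float = 2.0) -> float:
    return years / period

def growth_factor(years: float, period: float = 2.0) -> float:
    return 2 ** doublings(years, period)

print(f"After 10 years: {growth_factor(10):,.0f}x")   # 2**5  = 32x
print(f"After 20 years: {growth_factor(20):,.0f}x")   # 2**10 = 1,024x
print(f"After 40 years: {growth_factor(40):,.0f}x")   # 2**20 = 1,048,576x
```

The compounding is what makes the physical limits question so pressing: a few more decades on this curve is not a modest improvement but a millionfold one.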

However, there may be other limits. Although Tainter’s diminishing returns do not apply to computer technology itself, they do apply to some of its raw materials. Tainter has written about the possibility of peak oil or, more ominously, peak rare-earth materials (used in manufacturing computer hardware).

Peak Rare Earth?

We should clarify that, as ambiguous and frightening as the Singularity is, not reaching the Singularity (or not continuing on our civilization’s growth curve) is equally frightening. Both Geoffrey West’s TED talk and Tainter’s book show the mathematics of the sudden civilizational collapse that has occurred many times throughout history (and is referenced in some religious texts). These collapses are almost always very bloody. The difference today is that we have truly fearsome weapons, which we did not possess the last time civilization abruptly ended. Concerns like these are at least partially fueling Silicon Valley’s desire to mine asteroids for semiconductor materials and to found space colonies.

So, it’s not clear what will happen. We may continue for a long time on the exponential curve that Geoffrey West shows in his talk, exploiting space to find the raw materials needed to continue ever more ambitious technological advances. We might reach the Singularity. Or we might not, and that might be a much worse fate.


Next steps: Check out our YouTube channel for more great info, including our popular “Data Science Careers, or how to make 6-figures on Wall Street” video!