Thursday, October 25, 2012

Calculating the Beast

“A thing long expected takes the form of the unexpected when at last it comes.” ~Mark Twain

Across the world, teams of scientists, mathematicians, and computer programmers have spent decades trying to create human intelligence in a machine. There’s just one problem: it’s not going to happen the way we expect.

In fact, true A.I., if we can call it that, will not emerge from an isolated research laboratory at the hands of a few scientists, but out of society itself: from the collective decision-making network we call the market.

The key to understanding how and why this monumental event will take place lies in an overview of complexity theory.

Complexity


In his book Currency Wars, Jim Rickards outlines four main principles underlying complex systems:
  1. They run on exponentially greater amounts of energy as they increase in size
  2. They are prone to catastrophic collapse
  3. They are not controlled from the top-down but evolve naturally from local interactions
  4. They have emergent properties, i.e. the “whole is greater than the sum of its parts”
Rickards summarizes the first two points, currently on most people's minds, by saying,
“highly complex systems such as civilizations require exponentially greater amounts of energy inputs to grow, while [Joseph] Tainter shows how those civilizations come to produce negative outputs in exchange for the inputs and eventually collapse. Money serves as an input-output measure applicable to a Chaisson model because it is a form of stored energy. Capital and currency markets are powerful complex systems nested within the larger Tainter model of civilization. As society becomes more complex, it requires exponentially greater amounts of money for support. At some point productivity and taxation can no longer sustain society, and elites attempt to cheat the input process with credit, leverage, debasement and other forms of pseudomoney that facilitate rent seeking over production. These methods work for a brief period before the illusion of debt fueled pseudogrowth is overtaken by the reality of lost wealth amid growing income inequality.”
Once we reach this critical state, complex systems typically undergo some sort of change to their organizational structure. Rickards outlines three possible changes: simplification, collapse, or conquest. We can all agree that simplification through various measures is necessary but not likely to happen voluntarily. This leads us back to the dreadful collapse and then, perhaps, eventual conquest. Which of these outcomes is most likely depends largely on the third principle: how does society adapt in response to crisis?

We might be tempted at this point to think this means the response by governments and central banks, but according to complexity theory, complex systems are driven primarily from the bottom up, not the top down. Does that mean governments and banks have no influence over the economy? No. They just can’t control the economy's ultimate direction without expending huge amounts of energy and worsening the eventual adjustment.
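As a toy illustration of bottom-up order (my sketch, not Rickards’ or Tainter’s): in the snippet below, each cell follows a single local rule, adopting the majority state of its immediate neighborhood. No cell, and no central controller, ever sees the whole system, yet a stable global structure forms anyway.

```python
import random

def step(cells):
    """One update: each cell adopts the majority state (0 or 1) of itself
    and its two neighbors. Purely local; no cell sees the whole system."""
    n = len(cells)
    return [
        1 if cells[(i - 1) % n] + cells[i] + cells[(i + 1) % n] >= 2 else 0
        for i in range(n)
    ]

random.seed(1)
cells = [random.randint(0, 1) for _ in range(40)]  # disordered start
for _ in range(20):
    cells = step(cells)
# Global structure (solid blocks of # and .) emerges from local rules alone.
print("".join("#" if c else "." for c in cells))
```

Nothing in the rule mentions blocks or boundaries; they are an emergent property of repeated local interactions, which is the sense in which the system is driven from the bottom up.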

How does this relate to the topic of the Emergent Market? In terms of our modern financial system, the dominating influence of algorithmic trading is now a major concern for regulators. The question we have to ask is: can this trend be reversed? If it is true that complex systems cannot be controlled from the top down (due to their highly adaptive and unpredictable nature), we should expect financial technology to continue to evolve and produce further strange market behavior. For most, this means another flash crash or similar disruption, yet that view captures only one aspect and misses a much larger pattern taking place. That pattern, of course, is one of the most mysterious properties of complex systems: emergence.


So what is emerging from the market, you ask? To answer this question, I want to first revisit what I said at the very beginning of this article about scientists not being able to create human intelligence in a machine.

Given what we know about complex systems and their emergent properties, this should now be obvious. Human consciousness and intelligence are emergent properties of highly complex systems; they cannot be designed into a single machine, but can only emerge through numerous local interactions in a system that reflects the complex structure and collective behavior of humanity itself. Is that the market? I believe so.

Here's what it all boils down to: we have used technology to build a financial system that is beyond human control. According to complexity theory, this problem will eventually lead either to catastrophic collapse or to the emergence of a fully automated, self-regulating intelligence.

In the end, consciousness does not reside in a single neuron, and neither should we expect to find it in a single machine. How many machines does it take, then? Unfortunately, the problem with emergence is that you can't simply calculate when a large group of algorithms, animals, or people will herd into a single mass. It just happens. That's the nature of the beast.
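That incalculability can be made concrete with a classic toy model (my illustration, not the author's): Granovetter's threshold model of collective behavior. Each agent acts once enough others have acted; whether the whole crowd herds or almost nothing happens can hinge on a single agent's threshold.

```python
def cascade_size(thresholds):
    """Granovetter-style threshold model: agent i joins the crowd once the
    number of agents already acting reaches thresholds[i]. Iterate until no
    new agents join; return the final number of participants."""
    joined = 0
    while True:
        new = sum(1 for t in thresholds if t <= joined)
        if new == joined:
            return joined
        joined = new

# Thresholds 0, 1, 2, ..., 99: each joiner triggers the next one.
print(cascade_size(list(range(100))))                 # full herd: 100
# Raise one agent's threshold from 1 to 2 and the chain breaks immediately.
print(cascade_size([0, 2] + list(range(2, 100))))     # cascade stalls: 1
```

The two populations are nearly identical, yet one produces a total herd and the other a single actor. No simple summary of the threshold distribution predicts which outcome occurs, which is exactly the sense in which the tipping point resists calculation.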
