All this decay is so incremental that nobody thinks it possible that it could ever accumulate into a risk that threatens the entire system.
The funny thing about risk is that the risk everyone sees isn’t the risk that blows up the system. The mere fact that everyone is paying attention to a risk tends to defang it, as everyone rushes to hedge or reduce it.
It’s the risk that accumulates under everyone’s radar that takes down the system. There are several dynamics driving this paradox but for now let’s look at the paradox of optimization.
The paradox of optimization is that to optimize efficiency and output (i.e. profit), resilience must be sacrificed. This leaves the system vulnerable to collapse when it veers beyond the parameters set in stone by optimization.
Resilience is the result of a number of costly-to-maintain features. For example, redundancy: to optimize the supply chain, get rid of the higher-cost suppliers and depend entirely on the lowest-cost supplier. This sole-source optimization works great until the sole-source supplier encounters a spot of bother, or one of the sole-source supplier’s component manufacturers encounters a spot of bother. By the time the entire supply chain has collapsed, it’s too late to reconfigure a factory or increase the production of marginal suppliers.
The classic example of optimization and redundancy is a spacecraft. Oops, the oxygen valve just blew out. Install the replacement valve before we all expire. Oops, the spare valve was eliminated in the push to reduce weight.
Optimization assumes everything will work within very tight parameters. Maintaining redundancy, backups and adaptability is expensive, so as long as everything is working well, all those expenses, viewed as unnecessary, are cut to reduce costs and increase profits.
The same can be said of hedges. If the market only goes up, what’s the point of maintaining costly hedges against a crash that will never happen?
The longer optimized systems work as intended, the greater the confidence in the system. Since nothing has ever failed before, participants start taking liberties on the margins of the system, letting quality slip because quality control, training, etc. are all costly in time, money and effort. Since everything is working so well, why bother being fanatical about quality and risk mitigation?
This confidence feeds complacency and hubris. Since the O-rings have never failed, go ahead and launch. In our confidence and hubris, we stop paying attention to the limits of the optimized parameters: yes, the system works, but only if all the parameters hold.
While the crowd basks in complacency and hubris, risk seeps into the system in ways that few are paying attention to. Every optimized system invites taking liberties because pushing the boundaries doesn’t seem to have any downside. Consider subprime mortgages as an example: reducing the down payment required of home buyers didn’t break the system, so why not reduce it to zero? See, everything’s still working.
Since the system is so obviously robust, let’s dispense with verifying income of mortgage applicants and accept whatever they claim. And since the system is still working well with these tweaks, risk is obviously low so let’s rate all these mortgage pools as low-risk. And since the appetite for low-risk securities is so high, let’s bundle as many of these as we can and sell them to pension funds, etc.
Since everyone is only paying attention to what’s within the optimized parameters, the risk piling up outside those parameters goes unnoticed. The fact that the system hasn’t blown up yet induces a hubristic confidence bordering on quasi-religious faith that the system can absorb all kinds of sloppiness, fraud and run-to-failure dynamics and keep on ticking.
This substitution of faith for rigor goes unnoticed. The few who are watching the buildup of risk where no one else is looking are dismissed as perma-bears, Cassandras, etc. What’s passed off as rigorous (dot-plots, anyone? very tasty!) isn’t actually rigorous, because what’s being paid attention to is all well known and well tracked.
What breaks systems is not well understood. Once we start talking about non-linearity, phase shifts and incoherence, we’re in abstract la-la land. And so the human default is to place quasi-religious faith in the durability of whatever system has been optimized for specific conditions. When those conditions erode or change, nobody notices because the system is still functioning.
Once an optimized system is considered so robust as to be permanent, then participants start larding on mission creep and the corruption of self-enrichment. Nobody will notice if overtime is fudged, expense accounts padded, reports filled with copy-and-paste jargon, etc.
All this decay is so incremental that nobody thinks it possible that it could ever accumulate into a risk that threatens the entire system. Since the system is optimized for specific conditions and the feedbacks of resilience are viewed as unnecessary expenses (or given short shrift because they’re viewed as mere routine), the system veers outside its optimization parameters and begins orbiting the black hole of decoherence.
Systems that spiral into decoherence never emerge. The risk piled up outside what everyone had been trained to pay attention to, and so everyone who reckoned the system was stable and durable is unprepared for both its unraveling and the speed of that unraveling.
Optimizing a system for specific outputs generates risk outside the reach of whatever shreds of resilience are left after years or decades of decay, corruption, self-service and mission creep. Coherence without resilience is illusion.
Only the few paying attention to the accumulation of risk outside what everyone else sees and knows recognize the last chance to exit. Everyone else continues playing with the iceberg’s scattered ice chunks on deck, confident the ship is unsinkable because the risk has been mitigated with watertight bulkheads. Only the few paying attention to the details understand the ship will sink.
Quasi-religious faith in the Federal Reserve is not a substitute for the rigor of paying attention to what nobody else is paying attention to.