Monday, October 06, 2008

Endogenous Failure in Complex Systems

It is difficult to go back and understand how foolish we were during various past fads. To take an obvious example, why would anyone give a $300k loan, with no money down and no verified income, on an 800 square foot house in a very bad neighborhood in Los Angeles? What were they thinking? A similar thought occurs when one looks at the business models of internet companies in 1999. Whence the madness?

First, in spite of the slew of people who suggest they saw this all coming, we should admit it was generally unanticipated. Stating that housing prices cannot rise at their current rate forever, or warning about Fannie Mae without focusing on credit, still counts as missing this crisis, because without specifics, warning cries are not actionable (e.g., Richard Clarke's vague warnings about Al-Qaeda prior to 9/11 were ignored). Robert Shiller's concern for housing in his 2005 revision of Irrational Exuberance amounted to a vague forecast one could apply to anything: its recent rapid increase was unsustainable. Greg Mankiw's concern downplayed the credit risk, emphasizing instead the interest rate risk and the problem of incentives. The general view from academics was that these mortgage innovations (smaller down payments, lower credit scores) were not increasing risk, and those who said otherwise were in a highly neglected minority.

Thus, the specific focus on Fannie and Freddie missed the real risks, which were in credit. The risk from moral hazard, emphasized by Mankiw, was then thrown into a general debate over regulation and government meddling. I don't see how this debate, with ten times the resources applied to it, would have changed what has happened. As Warren Buffett noted, OFHEO, staffed with over 200 earnest workers, was basically directed at regulating Fannie and Freddie, and they missed both the accounting fraud around 2002 and this latest disaster.

To think this was obvious back in 2005, without hindsight, leads to a misdiagnosis. I'm not saying nobody saw this coming, only that those who correctly foresaw the crisis and its origins were heard but dismissed by those who should have been interested, and the list includes no one very famous (e.g., Stan Liebowitz, David Andrukonis). I never even heard about these trends, but as I was not an investor in anything related, it wasn't something I cared about.

A useful analog is the Challenger space shuttle disaster: it blew up in 1986 because the O-rings were too cold at liftoff and did not seal properly. With hindsight, this error was obvious; theoretically and empirically, the problem had been identified by many in the large organization responsible for Shuttle launches. While this risk was designated a 'launch constraint', meaning its failure would kill the shuttle and had to be addressed, in the context of the many launch constraints being overridden it did not seem so critical. The Shuttle 'worked' on its first flight even with 131 launch constraints violated, and after several flights many of these risks were deemed under control. They did not eliminate the risks; they just kept reclassifying their materiality over time, because you can't argue with success.

In any complex system, its mere presence suggests it is somewhat battle tested. Often a risk limit can be violated yet the system works, because of offsets elsewhere in the system, or because the risk is a function of time such that bad things happen only once a generation. If the system is successful, in terms of shuttle flights or mortgage default rates, it doesn't matter what your 'theory' is as to why certain risks are too great; these risks will be explained away, because in any complex system the theory of how one thing affects the whole is tenuous. There are too many interest groups benefiting from the system's current state, and they will find good reason to dismiss concerns as evidence of envy, selfish interest, ideology, or muck-raking sensationalism. The larger the system, the more resilient it is to criticism of any sort.
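
As a rough illustration of the 'once a generation' point, consider how little a short success record tells you about a rare risk. The sketch below is mine, not the author's, and assumes a hypothetical failure mode with a 1-in-30 chance of striking in any given year, with years treated as independent:

    # Chance that a rare, hypothetical once-in-30-years risk would have
    # shown up at all in a given number of clean years of operation.
    def prob_risk_observed(p_annual, years):
        return 1 - (1 - p_annual) ** years

    p = 1 / 30  # assumed 'once a generation' hazard
    for n in (5, 10, 25, 50):
        print(f"{n:>2} clean years -> chance the risk would have appeared: "
              f"{prob_risk_observed(p, n):.0%}")

After ten successful years, such a risk has only about a 29% chance of having appeared at all, so 'it has worked so far' is weak evidence that the risk was ever under control.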

A good current example of a trend that cannot continue, yet against which there is no data, is government debt. The official debt-to-GDP ratio is manageable, ranking around 26th or so worldwide, but off-balance-sheet liabilities, things like Medicaid and Social Security, increase our debt roughly five-fold (to around $60 trillion, compared to on-balance-sheet debt of about $10 trillion). Ever since the adoption of the unified budget during the Nixon administration, the government has had the privilege of spending the Social Security funds by transferring the money into the general fund, from which Congress can spend on whatever pork projects it wishes. Many government entities, city and state, keep increasing their off-balance-sheet liabilities at a rate that implies preposterous tax rates or reneging on promises, but no one worries because this has been going on for a while. You would go to jail if you did this in the private sector, yet it is OK because it seems to work. Stein's law states that trends that cannot continue, won't, which implies the government will either have to default, renege on benefits, or pressure the Fed to inflate. Those noting the risk of this strategy have been proven wrong by the absence of any failure so far. When the future budget crisis hits in this area, it will dwarf our current crisis by a factor of 10.
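
To make the scale concrete, here is a back-of-the-envelope calculation using the figures above. The $10 trillion and $60 trillion numbers are from the post; the roughly $14 trillion GDP denominator is my assumption for 2008-era US output, not a figure from the post:

    on_balance_debt = 10e12      # stated federal debt (from the post)
    total_with_promises = 60e12  # debt incl. off-balance-sheet promises (from the post)
    gdp = 14e12                  # assumed 2008-era US GDP

    print(f"Official debt / GDP:          {on_balance_debt / gdp:.0%}")
    print(f"Debt incl. promises / GDP:    {total_with_promises / gdp:.0%}")
    print(f"Off-balance-sheet additions:  "
          f"{(total_with_promises - on_balance_debt) / on_balance_debt:.0f}x the stated debt")

On these numbers the stated ratio sits near 70% of GDP and looks manageable, while the full obligation is over four times GDP, which is the sense in which the promises imply either preposterous tax rates or reneging.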

I don't see how the risk of a complex system can be correctly calibrated without massive failure, because there are just too many incentives to rationalize risks as being under control as long as the system is working, and so it continues until failure. These are endogenous risks to our system, so the economy will never achieve a steady state, which, given over a hundred years of business cycles, is a pretty safe forecast. The bigger question, in my mind, is why large systems don't fail more often. The average annual corporate default rate in the US is around 1.4%, over good and bad times, which is pretty low.
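
For a sense of what that 1.4% figure compounds to, here is a quick sketch of mine, under the naive assumption that the hazard is constant and independent across years (real defaults cluster in bad years, so this is only a baseline):

    annual_default_rate = 0.014  # average annual US corporate default rate cited above

    for years in (1, 10, 30):
        survival = (1 - annual_default_rate) ** years
        print(f"Over {years:>2} years: {survival:.1%} survive, {1 - survival:.1%} default")

Even at that 'low' rate, roughly 13% of firms default within a decade and about a third within thirty years, so the real puzzle is why the annual rate stays that low through repeated cycles.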

3 comments:

Anonymous said...

aha, endogenous and has to continue until failure when we pretend it was unexpected. you can see it from lehman emails. kedrosky has a couple up there.

Anonymous said...

Only Goldman Sachs (and possibly a few obscure hedge funds) can claim to have called the subprime crisis - they traded on it.

Liz said...

For a great look at major information disasters read "Deadly Decisions" by Chris Burns. How did we reach the wrong decision on a number of disasters, including the Challenger? This book presents great research into how the size, structure and loyalty of groups leads us to make deadly decisions. It's very eye-opening.