Tuesday, August 16, 2011

A Perennial Risk Problem

Cash-strapped Detroit recently announced it won't be responding to home alarms: 99% of them are false. A naive approach to risk is to itemize everything that can go wrong, but that is as bad as not mentioning any risks at all: moderation in all things, in this case the perennial balance between type 1 and type 2 errors. Hyperactive risk reports have the benefit of rarely being 'wrong', just not useful, because after a short while decision makers get used to ignoring them. I remember the first time I bought a house, when I didn't know what to look for in the inspection. I hired someone to do it for me, and for a couple hundred bucks I got an unprioritized list of over 100 items, which I found totally unhelpful, but I had to pay him (he obviously did work).
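
To see why a 99% false-alarm rate is less scandalous than it sounds, here is a back-of-the-envelope Bayes calculation. The numbers are purely illustrative assumptions, not Detroit's actual statistics:

    # Illustrative numbers only -- not Detroit's actual figures.
    # Even a fairly accurate alarm produces mostly false positives
    # when real break-ins are rare (the base-rate effect).
    p_burglary = 0.001                # assumed chance a trigger event is a real break-in
    p_alarm_given_burglary = 0.95     # assumed sensitivity
    p_alarm_given_no_burglary = 0.05  # assumed false-trigger rate

    p_alarm = (p_alarm_given_burglary * p_burglary
               + p_alarm_given_no_burglary * (1 - p_burglary))
    p_burglary_given_alarm = p_alarm_given_burglary * p_burglary / p_alarm
    print(f"P(real burglary | alarm) = {p_burglary_given_alarm:.1%}")  # ~1.9%

Even with a quite good detector, roughly 98% of alarms are false under these assumptions, so the police learn to ignore them.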

The 1986 Challenger Space Shuttle disaster was a great example. No fewer than 748 parts were designated 'Criticality 1', meaning they violated NASA's redundancy criterion: if one failed, the shuttle would be lost. In the program's first 25 flights, up through Challenger's last, 131 technical flaws proved serious enough to warrant NASA's designation of 'launch constraint.' Of these, 66 were resolved after one flight; the rest, like the O-ring joint that ultimately failed, were overridden repeatedly. When that many high-level warnings are going off at once, that is what happens.

Real risk reports prioritize risks in proportion to their expected damage: probability times cost. Often these probabilities are so small they are rather qualitative, but such is risk. Nonetheless, Detroit reminds us that merely saying anything or everything can go wrong, while true, is quite useless and not profound. Enumerating a long list of disparate things that may happen, without any probabilities, may work for Nouriel Roubini, but he's a charlatan: here he is last week taking credit for his client switching to cash a couple months ago, yet he doesn't mention he has been suggesting investors go to cash since 1990, always for a slew of reasons (though Lawrence Summers is here giving props to Roub for calling the housing crisis). Many times in large corporations I have seen reports that endlessly enumerate risks, and they are highly correlated with people who either are too afraid to make a mistake or profoundly do not understand their job.
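
To make the prioritization concrete, here is a minimal sketch of ranking by expected damage. The risk names, probabilities, and costs are hypothetical, made up for illustration:

    # Hypothetical risk register: rank by expected damage = probability * cost.
    risks = [
        ("O-ring joint erosion", 0.01, 3_000_000_000),  # made-up figures
        ("Ground software bug",  0.10,    50_000_000),
        ("Paperwork misfiling",  0.50,        10_000),
    ]
    for name, p, cost in sorted(risks, key=lambda r: r[1] * r[2], reverse=True):
        print(f"{name:<22} expected damage = ${p * cost:,.0f}")

Sorted this way, the rare catastrophe dominates the nuisance items, which is exactly the ordering a useful report should show; a flat list of 100 unweighted items shows nothing.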

4 comments:

r2d2 said...

A lot of wisdom seems to be based on the analysis of only one type of error, ignoring the other. I think we tend to focus on the type of error that has more visible and measurable negative consequences when it occurs. For instance, "better that ten guilty persons escape than that one innocent suffer". It's more natural to feel sorry for the innocent who was wrongly imprisoned than for an unknown number of other innocents who have to suffer the harm the ten criminals you let go are very likely to do to them. As with other things, we decide in such a way as to minimize the emotional stress we expose ourselves to, rather than maximize the overall result of our actions.

Anonymous said...

Even a stopped clock tells the right time twice a day.

Charles Butler said...

Totally amazing that Alphaville would link to this, doncha think? Perhaps a bit of self-criticism...

Candide said...

I'm not an economist at all, but from a purely analytical point of view one might add that there's an option C: people who profoundly do not understand their job and are therefore afraid to make a mistake.