Bill Werner is an Engineer in Missouri City, Texas. The following is his review of this year’s “worst call” and an attempt at drilling down to the ultimate problem.

~~~


“You will see when you can swallow the world in one gulp.”
-Zen Aphorism


My submittal for the worst call is "housing markets cannot fall simultaneously across the U.S. (and U.K.)," made by analysts and quants who then plugged that call as an assumption into their credit default swap (CDS) models, resulting in the near (!?!) collapse of the capital markets. This call, based on past empirical data showing that regional housing markets had never gone down simultaneously before, is the result of very smart people falling prey to the psychology of idiots and refusing to actually think about the problem and take the risk of ruin seriously.

This utter refusal to take the risk of ruin seriously is reminiscent of Long-Term Capital Management (LTCM). If a model builder does not take into account the possibility that the model may not be perfect, the model is bound to fail at the worst possible time. Rather, the model builder must incorporate a serious risk-of-ruin component and have the discipline to abide by it. In the final analysis there is no perfect model. And when something goes wrong in the real world, it does not care about Nobel Prizes, statistics, numerical methods, infinite-dimensional optimizations, etc. All that elegance and sophistication may actually serve to blind us to the possibility of a spectacular systemic failure.
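For readers who want to see the arithmetic, here is a minimal sketch of what a hard risk-of-ruin check might look like, using the classic even-payoff gambler's-ruin formula; the win probability, capital units and 1% tolerance below are purely illustrative assumptions, not parameters from any real CDS model.

    # Illustrative sketch only: gambler's-ruin estimate for an even-payoff bet
    # with a fixed edge; every number here is made up for the example.

    def risk_of_ruin(p_win: float, units_of_capital: int) -> float:
        """Probability of eventually losing all capital, risking one unit per bet."""
        if p_win <= 0.5:
            return 1.0  # with no edge, ruin is certain in the long run
        return ((1.0 - p_win) / p_win) ** units_of_capital

    # The discipline part: refuse to deploy if the estimate breaches a hard limit.
    if risk_of_ruin(p_win=0.55, units_of_capital=20) > 0.01:
        raise RuntimeError("Estimated risk of ruin too high; do not deploy this model")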

The empirical-data aspect of our present error reminds me of sitting in an undergraduate macroeconomics course in 1968 and hearing the professor spout that the overwhelming consensus among economists was that the business cycle was about to be eliminated by monetary policy. He then went on to explain that the linchpin of his assertion was the Phillips Curve, which at the time was based on over a century of empirical data "proving" the inverse relationship between inflation and unemployment. A class of science and engineering geeks was skeptical and, in today's terminology, predicted the possibility of a black swan. Policy at the time was based on the Phillips Curve and similar propositions. Ultimately the students were proven right when the inverse relationship between inflation and unemployment broke down in the 1970s, resulting in a new word to describe the phenomenon: stagflation. Paraphrasing Soros's guru, the late philosopher Karl Popper, empirical data can never rigorously prove anything.

Take a common model, a restaurant menu. A menu is a model of the meals that can be ordered and eaten. Many of our model makers tend to get lost in the elegance of the models of their own creation and, in effect, indulge in eating the menu, mistaking the model for reality. This may be harmless, or it may result in serious indigestion or worse. A risk-of-ruin component has to tie back to the external world in a way that takes into account the possibility that we may be "eating the menu." It can happen to any of us. Moreover, it should be pointed out that "eating the menu," or taking a virtual model to be more real than the world we actually live in, is not limited to finance but is rampant throughout business today. This may have always been the case, but today's computational power combined with lightning-fast communications has magnified the effects to a stunning degree. This is a real "Rabbit Hole."

If you think about it, this Rabbit Hole presented by modeling is rather obvious for finance, business and even models in science. But it is much deeper and wider than one might at first suppose. Remembering that Lewis Carroll was a master of Symbolic Logic and following the White Rabbit, it is obvious that we use the languages of mathematics to construct models, but what about our languages of daily discourse? We live in a Vast Matrix of Models of Mind Stuff that are as dangerous as they are powerful. Just look and see…


Please use the comments to demonstrate your own ignorance, unfamiliarity with empirical data and lack of respect for scientific knowledge. Be sure to create straw men and argue against things I have neither said nor implied. If you could repeat previously discredited memes or steer the conversation into irrelevant, off topic discussions, it would be appreciated. Lastly, kindly forgo all civility in your discourse . . . you are, after all, anonymous.

10 Responses to “MODELS & THE RISK OF RUIN”

  1. worth says:

    Here’s a model of the visual/mental type, no math required:
    Business cycle is not a cycle, it is not a bubble, it is a balloon.
    It inflates, gets to the point where it's too tight to expand any more, then has a pressure-release valve engage. The more over-inflated it is, the more forcefully the pressure will force its way out. Once the pressure is down, it can start re-inflating once more. Nothing ever pops; no graphs or charts or trendlines apply. Oh, and each time it inflates, it softens the balloon walls a little more, so it can inflate a little more each time than the last (before inevitably deflating again, either under the force of its own pressure or through governmental actions taken either to let a little air out ahead of time or to force it to quickly and briefly over-inflate past its natural release point).

  2. dwkunkel says:

    This is an example of what I call boardroom incest. A group of like-minded people who share the same backgrounds and perspectives agrees on a course of action based on their shared view of reality. If you throw in the usual fawning sycophants, you end up with an enthusiastically endorsed decision that can't fail.

    You end up with something like the Pontiac Aztek.

  3. Good point, bad example:

    A post, and one person agreeing, is hardly the sort of reflexive echo chamber you see on the extreme right and left.

    But I understand the point you were trying to make . . .

  4. nnnick777 says:

    All of this brings to mind the book “Normal Accidents” by Charles Perrow. He posits that as systems become increasingly complex and interdependent, catastrophic accidents become the norm rather than the exception. His examples include the space shuttle and nuclear power, but the concept easily applies to financial markets too.

    This concept would also seem to imply that to reduce systemic risk, we should seek to reduce the complexity of the entire system.

  5. leftback says:

    I dated a model once. Almost ruined me…

  6. Bill Werner says:

    Complexity is certainly a problem. In Refineries and Chemical Plants, the Emergency Shutdown System (ESD) or Safety Instrumented System (SIS) is normally completely separate from the Distributed (Process) Control System (DCS). This is done to keep the ESD/SIS as simple as possible (but, as Einstein would say, no simpler). For added reliability, ESD/SISs often have at least one layer of redundancy, sometimes with voting.

    There is a whole technology around these issues, including Safety Integrity Levels (SIL), failure rates and mean time between failures (MTBF), Risk Reduction Factors (RRF), etc. In addition, there are often smaller alarm systems that alert operators, technicians and engineers to problems but do not shut down the plant or operating unit. There is also a great effort to make systems that are "idiot-proof."
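    As a toy illustration of the voting and risk-reduction arithmetic (the sensor states, PFD value and SIL band below are textbook-style numbers, not data from any real plant):

        # 2-out-of-3 (2oo3) shutdown voting plus the risk reduction factor arithmetic;
        # all values are illustrative only.

        def vote_2oo3(trip_a: bool, trip_b: bool, trip_c: bool) -> bool:
            """Trip the unit only if at least two of three independent sensors agree."""
            return sum([trip_a, trip_b, trip_c]) >= 2

        def risk_reduction_factor(pfd_avg: float) -> float:
            """RRF is the reciprocal of the average probability of failure on demand."""
            return 1.0 / pfd_avg

        print(vote_2oo3(True, True, False))   # True -> shut the unit down
        print(risk_reduction_factor(0.005))   # 200.0, within the SIL 2 band (RRF 100 to 1000)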

    But catastrophic failures still do occur. And when they do there are usually serious investigations especially if there are fatalities.

    Reference: “Control Systems Safety Evaluation and Reliability” 2nd Ed. by William Goble, isa.org, 1998

  7. aurora17 says:

    Models are strictly GIGO: garbage in, garbage out. They're only an approximation to reality, real or anticipated. Their value is almost proportional to the resources expended in developing and maintaining them. Those who use them should be aware of those limitations.

    A typical simplifying ruse in model making is to linearize the representation of a phenomenon (e.g., efficiency of a distribution system vs. its cost, number of nodes, modes of transfer) in order to simplify the math and its cost of development. At best such models may keep a decision maker from a disastrous mistake; but recall the prescient yet incredibly vapid pronouncements of the unlamented and departed Donald Rumsfeld about the categories of "knowns" and "unknowns" that we may or may not know.
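    A toy sketch of how that ruse goes wrong (the cost curve and fitting points are invented purely for illustration): fit a straight line through two nearby operating points of a convex cost curve, then extrapolate.

        # Invented example: linearizing a convex cost curve and extrapolating it.
        def true_cost(n: float) -> float:
            """'True' nonlinear cost of a distribution system with n nodes (made up)."""
            return 100 + 5 * n + 0.8 * n ** 2

        # Straight line fitted through two nearby operating points.
        n1, n2 = 10, 20
        slope = (true_cost(n2) - true_cost(n1)) / (n2 - n1)

        def linear_cost(n: float) -> float:
            return true_cost(n1) + slope * (n - n1)

        for n in (15, 50, 100):
            print(n, true_cost(n), linear_cost(n))
        # Near the fitting range the line is fine; at n = 100 it understates the
        # true cost by roughly a factor of three.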

    As regards inquiries about the possible outcomes of hypothetical events, a very useful application of models, it's worth reading the book "The Black Swan" by Nassim Taleb. Had I thought about its lessons last spring, and been prescient enough to apply them, I'd be a lot richer today.

    aurora17

  8. steve says:

    Basically all I see in this essay is

    1) reality don't care about all yer fancy degrees.
    2) models don't always work.

    Then he takes a remark he remembers from an economist who was probably just knowingly making an over-general statement because he was talking to a class of undergraduate non-economics majors, and says, "Us engineers knew better than that guy."

    Frankly I don't know why Mr. Ritholtz published this, it's so uninformative.

    It does remind me of something though. Go to the wacko Intelligent Design websites like Uncommon Descent and you'll find an unusually high number of engineers there, sneering at how dumb and naive professional biologists are about biology, claiming they're better than those evolutionist eggheads because engineers are out "in the real world" and "lives are on the line," etc. Or read Steven Den Beste's old essays about why the Iraq War was going to be the best thing ever. He made a big deal about how his engineering training made him a better analyst than the so-called professionals, too.

  9. karen says:

    the essay irked me, as well. i would argue that the model didn’t fail on the basis of home prices. the model failed on the basis of widespread, fraudulent lending practices. when in history was money ever lent without concern for it being repaid? that is what led to this absurd and obvious housing bubble.

    the essay did engender some good comments, tho.

  10. Bill Werner says:

    [Revision & clarification of the above comment, with trading analogies]

    Complexity is certainly a problem. In Refineries and Chemical Plants automatic control is simplified by separating functions:

    1. Process Control System (DCS): analogous to the Model in the article.
    2. Emergency Shutdown System (ESD): analogous to a Risk of Ruin feature in the article.
    3. Alarm Systems: early warning systems not tied into the DCS or ESD above; they look for things that are not necessarily covered by those controls. Analogous to an unexpected bad report not usually considered in the model.

    The DCS or Distributed Control System essentially contains a Model of the Process being controlled and is normally automatic once it is tuned properly. This would also be analogous to a trading system generating buy and sell signals based on a model.

    The ESD or shutdown system automatically shuts down the plant (operating unit). This would be like a stop loss provision that automatically closes a position in a trading system.

    If an Alarm System gives an operator a warning, he can take action before the emergency shutdown system trips. In a trading system this might be a condition that causes a trader to reduce a position. Or, in the present situation, it was widely known that NINJA (bad) mortgages were being issued. Where were the alarm sirens?
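    Put together as a rough sketch of the analogy (the return series, drawdown thresholds and position rule below are invented for illustration, not a real trading system):

        # Three layers in one toy loop: the model keeps steering the position ("DCS"),
        # a small drawdown warning cuts exposure ("alarm"), and a larger drawdown
        # shuts everything down ("ESD" / stop loss).
        def run_strategy(returns, alarm_drawdown=0.05, ruin_drawdown=0.15):
            equity, peak, position = 1.0, 1.0, 1.0
            for r in returns:
                equity *= 1.0 + position * r        # model-driven position earns the period return
                peak = max(peak, equity)
                drawdown = 1.0 - equity / peak
                if drawdown > ruin_drawdown:        # "ESD": stop loss closes the book
                    return equity, "shut down"
                if drawdown > alarm_drawdown:       # "alarm": warn and cut the position, keep running
                    position = 0.5
            return equity, "ran to completion"

        print(run_strategy([0.01, -0.04, -0.03, 0.02, -0.20]))   # -> (0.85..., 'shut down')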

    The following is where the analogy starts to break down. Refineries and Chemical Plants are dangerous places where there is the potential for immediate loss of life. And while many processes are not understood in the detail we would like, how to measure and control them is fairly well understood.

    The situation in finance is that primary data is often revised and the operating rules can change with little warning, as we see in the current crisis. One of the points of the article is to be on the lookout for incorrect assumptions in the mathematical model, or in the verbal model of what we think is going on. Sometimes our math models and mental models can blind us to what is actually happening. Or, as Will Rogers put it: "It's what we think we know that just ain't so that really hurts us."