Risk and CDOs

This is one of the better simple explanations of both the appeal and the hidden risk of CDOs. The example, which is short and worth working through, ends this way:

Suppose that we misspecified the underlying probability of mortgage default and we later discover the true probability is not .05 but .06.  In terms of our original mortgages the true default rate is 20 percent higher than we thought--not good but not deadly either.  However, with this small error, the probability of default in the 10 tranche jumps from p=.0282 to p=.0775, a 175% increase.  Moreover, the probability of default of the CDO jumps from p=.0005 to p=.247, a 45,000% increase!

The dark magic of structured finance conjured many low-risk securities out of many risky securities.  Like all dark magic, however, the conjuring came at a price because if you didn't get the spell exactly correct it was easy to create something much more risky and dangerous than you were likely to have ever imagined.
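The jump in those probabilities is easy to reproduce. Here is a minimal sketch, assuming the standard setup behind the quoted figures (100 mortgages, a "10 tranche" that defaults once 10 or more mortgages default, and a CDO built by pooling 100 such tranches and applying the same 10-or-more rule):

```python
# Reproducing the quoted tranche arithmetic. Assumed setup: 100 mortgages,
# the "10 tranche" defaults once 10 or more mortgages default, and the CDO
# pools 100 such tranches under the same 10-or-more rule.
from math import comb

def tail_prob(n: int, p: float, k: int) -> float:
    """P(X >= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, j) * p**j * (1 - p)**(n - j) for j in range(k, n + 1))

for p in (0.05, 0.06):
    tranche = tail_prob(100, p, 10)    # default probability of the "10 tranche"
    cdo = tail_prob(100, tranche, 10)  # default probability of the CDO built from 100 tranches
    print(f"mortgage p={p}: tranche={tranche:.4f}, CDO={cdo:.4f}")
```

With p = .05 this yields roughly .0282 for the tranche and .0005 for the CDO; with p = .06, roughly .0775 and .247, matching the figures quoted above. The 20 percent error in the mortgage default rate is amplified at each successive layer of tranching.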

As an ex-engineer who used to do a lot of operations analysis as well as post-disaster failure analysis, I see here a central theme that I have found in many such failures -- people tend to overestimate their own knowledge.

At the start of a class at HBS, the professor had us all take a 20-question survey. It asked questions like "what is the population of Argentina?" and then asked us to give the lowest and highest numbers such that we thought the true answer had a 95% chance of falling in that range. On that basis, only about one of my 20 answers should have fallen outside my limits. About eight of them did. It was a really good lesson in overestimating one's knowledge.
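The same binomial arithmetic shows just how damning that result is. If the intervals really were calibrated 95% ranges, each of the 20 answers would miss independently with probability .05, and missing eight or more would be astronomically unlikely. A quick sketch:

```python
# How unlikely is missing 8 or more of 20 questions if each stated 95%
# interval really contained the true answer with probability 0.95?
from math import comb

n, p_miss = 20, 0.05
p_8_or_more = sum(comb(n, k) * p_miss**k * (1 - p_miss)**(n - k)
                  for k in range(8, n + 1))
print(f"P(8+ misses out of 20) = {p_8_or_more:.2e}")  # on the order of a few in a million
```

So the intervals were not 95% intervals at all; they were far narrower than the respondents' actual knowledge justified.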

Which leaves me with a thought -- if we define a large part of the problem as overestimating our understanding of a phenomenon, then from your observation of the Obama administration and its personalities, what gives you any confidence that a new layer of government regulators will solve this problem?

4 Comments

  1. NormD:

I assume that the "experts" who created and invested in CDOs understood this phenomenon?

  2. IgotBupkis:

    I'd like to see that test. Got a source, link, or a copy available?

  3. rsm:

Yep, but they believed they were right and overestimated their certainty. Being human, that is a matter of course.

  4. rxc:

There are a lot of initiatives in the Federal government to move toward this sort of "risk-informed" regulation. It started in the late 90s, when the Republican congress pushed the regulators (many of whom do NOT, actually, work for the administration in power, but rather for the Congress - it is a complicated system), and got a LOT worse with Bush II. The theory is that govt needs to take more risks in its decision-making, just like business does (did?). We got lectures from consultants who had come from Arthur Andersen (remember them?) about re-inventing govt and how to change our culture to embrace the change (Change is good, change is good, change is good...).

I was a vocal opponent of this crap, and I once asked the head consultant (Louie) whether he had ever done this in another govt agency, and he admitted that we were the first he had tried - he was sure it worked, because the financial business and GE and other big companies used this technique, and he had the data to prove it. In the middle of all this, Enron collapsed, and the AA consulting arm became Accenture, but the crap continued.

We eventually got an agency head who was a big Bush supporter who tried to pass a rule making our most fundamental regulations more "risk-informed", to the point that we did an enormous amount of work trying to quantify many sources of uncertainty, and many people bought into this in a big way. Then we had an "interesting event" at a plant in the mid-west that would not have shown up in any of the calculations (what the financial people call a black swan event - we call it an outlier), and we paused, but not for long, and eventually we came dangerously close to passing this risk regulation. I like to think that I had a part in causing it to fail, because I talked to some very influential people who had to pass on it, and I gave them some examples of future scenarios that I considered likely if the rule went through. They advised that the issue needed to be studied some more, and it was essentially shelved. The head of the agency has left, and I don't think it will come up again, especially in light of the financial meltdown.

    BTW, I used to regulate nuclear power plants, so you can see that this ignorance of outliers might have had some exciting consequences...