Indeed, Alan Greenspan describes this failure well:
"Those of us who have looked to the self-interest of lending institutions to protect shareholder's equity (myself especially) are in a state of shocked disbelief. … It was the failure to properly price such risky assets that precipitated the crisis. In recent decades, a vast risk management and pricing system has evolved, combining the best insights of mathematicians and finance experts supported by major advances in computer and communications technology. A Nobel Prize was awarded for the discovery of the pricing model that underpins much of the advance in derivatives markets. This modern risk management paradigm held sway for decades. The whole intellectual edifice, however, collapsed in the summer of last year because the data inputted into the risk management models generally covered only the past two decades, a period of euphoria."— Testimony of Dr. Alan Greenspan, US House of Representatives Committee on Government Oversight and Reform, October 23, 2008
He's right, of course, though I get the sense that I and many others are much less shocked than he is. Perhaps it is a feature of academic economics, which tends to put on blinders when considering issues of human psychology or chaotic complexity that can't be easily reduced to a neat, tidy equilibrium model, but many people saw the writing on the wall before mainstream economists did.
But Greenspan does a (sort of) brave thing by fingering the failure of the pricing "model" as the core reason why the market did not, could not, properly price the risks it had taken on (issues of faulty information and ratings fraud aside).
Enter Nassim Taleb, quantitative trader and author of Fooled by Randomness and The Black Swan. Both books focus on the failure of individuals and institutions to properly understand randomness and risk, and on the frequent misapplication of statistics. Back in September he wrote an original article for Edge.org entitled "The Fourth Quadrant: A Map of the Limits of Statistics." As Taleb writes:
"Statistical and applied probabilistic knowledge is the core of knowledge; statistics is what tells you if something is true, false, or merely anecdotal; it is the "logic of science"; it is the instrument of risk-taking; it is the applied tools of epistemology; you can't be a modern intellectual and not think probabilistically—but... let's not be suckers. The problem is much more complicated than it seems to the casual, mechanistic user who picked it up in graduate school. Statistics can fool you. In fact it is fooling your government right now. It can even bankrupt the system (let's face it: use of probabilistic methods for the estimation of risks did just blow up the banking system)."(read the whole article here)
This seems like a critical piece of the puzzle of what went wrong with Greenspan's models. We are in this mess largely because of the success of models, models that rely on statistics and measurements of our existing economy. Our society, however, accords too much respect to highly complex, heavily mathematical, jargon-laden models that use detail and short-term success to browbeat detractors into submission when in fact they are fundamentally and systematically flawed* - flaws that surface in the long term, with serious consequences. This is a serious critique of economic science in general - one I'm not sure can be easily met.
Look at the last sentence in the Greenspan quote: "the data inputted into the risk management models generally covered only the past two decades, a period of euphoria." Here's one obvious problem: your "data" may only reflect conditions that haven't changed lately (like house prices), so trends can be mistaken for underlying principles. Furthermore, complex, chaotic systems (like an economy full of people) are characterized by a small number of highly significant events. With a small sample size and very complex payoffs, statistics becomes, if not impossible, severely constrained. As a result, we tend to base our models on the things that are easily observable and not on the really significant events, which are very hard to predict.
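To see how badly a calm historical window can mislead, here's a toy sketch of my own (the distribution, the numbers, and the 10% crash threshold are all illustrative assumptions, not anyone's actual risk model): the world generates fat-tailed returns, but the risk model is a Gaussian fitted to the observed history.

```python
# Toy sketch: fit a Gaussian risk model to fat-tailed returns, then compare
# the modeled vs. true probability of a 10% one-day loss. All numbers here
# are invented for illustration.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# "True" world: fat-tailed daily returns (Student-t with 3 degrees of
# freedom, scaled so typical moves are on the order of 1%).
df_true = 3
scale = 0.01
history = scale * stats.t.rvs(df=df_true, size=250 * 20, random_state=rng)  # ~20 years

# The risk model: a normal distribution fitted to that same history.
mu, sigma = history.mean(), history.std()

crash = -0.10  # a 10% one-day loss
p_model = stats.norm.cdf(crash, loc=mu, scale=sigma)  # what the model says
p_true = stats.t.cdf(crash / scale, df=df_true)       # what the world says

print(f"Gaussian model:   P(10% daily loss) = {p_model:.1e}")
print(f"Fat-tailed world: P(10% daily loss) = {p_true:.1e}")
```

The fitted model calls the crash a once-in-many-universes event, while the fat-tailed world makes it merely uncommon - a gap of several orders of magnitude. That's Greenspan's "period of euphoria" problem in miniature.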
Taleb finishes up with some meta-suggestions about living in what he terms "Extremistan." Of these I was most impressed by his admonition to avoid optimization (or maximization) and to embrace redundancy. The closer we run to perfect efficiency, the fewer resources we have to fall back on when something breaks or turns out unexpectedly. The financial firms wanted to maximize returns, so they had all their capital out, leveraged, raking in the returns. Cash, on the other hand, has poor returns, so why keep any around?
He draws the obvious analogy to biological systems, which are nothing if not inefficient (how many blossoms does a cherry tree really need?) but highly redundant - and they have survived for millions of years, through huge upheavals. Redundancy is important. "You certainly pay for it [in the short term], but it may be necessary for survival."
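The trade-off is easy to simulate. Here's another toy sketch of my own (the return figures, leverage ratio, and crash size are invented for illustration): an "optimized" firm keeps all its capital deployed at leverage, a "redundant" firm keeps an unproductive cash buffer, and both face the same rare shock.

```python
# Toy illustration of redundancy vs. optimization: two firms in the same
# market, one fully leveraged, one holding a cash buffer. All figures here
# are invented for illustration.
import numpy as np

rng = np.random.default_rng(1)
years = 20
market = rng.normal(0.08, 0.15, size=years)  # ordinary years: ~8% +/- 15%
market[15] = -0.45                           # one rare, severe crash

def run(cash_fraction, leverage):
    equity = 1.0
    for r in market:
        # Only the deployed (non-cash) portion is exposed, at leverage.
        equity *= 1 + (1 - cash_fraction) * leverage * r
        if equity <= 0:  # wiped out: no capital left to continue
            return 0.0
    return equity

print("Optimized (0% cash, 3x leverage):", round(run(0.0, 3.0), 2))
print("Redundant (30% cash, unlevered): ", round(run(0.3, 1.0), 2))
```

In the ordinary years the leveraged firm compounds much faster - the short-term success that browbeats detractors - but the single tail event zeroes it out, while the buffered firm absorbs the hit and keeps going.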
Check out the whole article. A must-read.
* Hat tip to Gavin McCormick