Tomorrow's New York Times has an article, 'Creating Quant Models That Are Closer to Reality.' The assumption in the article is: "That failure suggests new frontiers for financial engineering and risk management, including trying to model the mechanics of panic and the patterns of human behavior."
They are ignoring the study of complex, highly optimized systems governed by nonlinear processes, a line of work that goes back to René Thom's catastrophe theory in the late 1960s and to crashing predator-prey populations in the wild governed by simple differential equations.
One of the basic implications of optimization theory is that HIGHLY OPTIMIZED SYSTEMS ARE PRONE TO CATASTROPHIC FAILURE.
They won't succeed.
Networks and systems where the nodes have power law distributed connections can be resistant to random failure, but are vulnerable to carefully targeted attack.
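This robust-yet-fragile property of power-law networks is easy to demonstrate. The sketch below (my own toy construction, not from any source cited here) grows a preferential-attachment network in pure Python, then compares the size of the largest connected component after removing 10% of the nodes at random versus removing the 10% highest-degree hubs:

```python
import random
from collections import deque

def preferential_attachment(n, m, seed=0):
    """Grow a graph where each new node links to m existing nodes
    chosen proportional to degree (so degrees follow a power law)."""
    rng = random.Random(seed)
    adj = {i: set() for i in range(m + 1)}
    for i in range(m + 1):            # start from a small complete graph
        for j in range(i + 1, m + 1):
            adj[i].add(j); adj[j].add(i)
    endpoints = [v for v in adj for _ in adj[v]]   # degree-weighted pool
    for new in range(m + 1, n):
        targets = set()
        while len(targets) < m:
            targets.add(rng.choice(endpoints))
        adj[new] = set()
        for t in targets:
            adj[new].add(t); adj[t].add(new)
            endpoints.extend([new, t])
    return adj

def largest_component(adj, removed):
    """Size of the largest connected component once `removed` nodes
    (and their edges) are deleted, via breadth-first search."""
    seen, best = set(removed), 0
    for start in adj:
        if start in seen:
            continue
        size, queue = 0, deque([start])
        seen.add(start)
        while queue:
            v = queue.popleft()
            size += 1
            for w in adj[v]:
                if w not in seen:
                    seen.add(w)
                    queue.append(w)
        best = max(best, size)
    return best

n = 2000
adj = preferential_attachment(n, 2)
k = n // 10                                       # knock out 10% of nodes
random_hit = largest_component(adj, random.Random(1).sample(list(adj), k))
hubs = sorted(adj, key=lambda v: len(adj[v]), reverse=True)[:k]
targeted_hit = largest_component(adj, hubs)
print(random_hit, targeted_hit)   # targeted hub removal fragments far more
```

Random failure leaves the giant component nearly intact, while the targeted attack on hubs shatters it, which is exactly the asymmetry described above.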
Highly Optimized Tolerance (HOT).
HOT theory examines robustness and catastrophic failure in large complex systems. HOT applies to complex systems that show optimal behavior in some dimension. For example, one of the early abstract models of a HOT system involves a theoretical forest. Random "sparks", with a particular distribution, land in the forest and burn areas of trees that are interconnected. A forest manager can build fire breaks that separate one section of the forest from another, preventing the spread of a forest fire caused by a random spark. An optimized forest is one that preserves the number of trees while minimizing the damage caused by forest fire. Using a variety of optimization algorithms, Carlson and Doyle found that the optimized abstract forest was robust for one distribution of sparks, but could be catastrophically fragile in the case of another distribution. The losses in the forest fire model followed a power law.
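The flavor of this result can be reproduced with a heavily simplified sketch (my own construction, not Carlson and Doyle's actual model): a one-dimensional "forest" of cells, a fixed budget of firebreaks, and a dynamic program that places the breaks to maximize expected surviving trees under a given spark distribution. Optimizing for sparks concentrated at one end, then shifting the sparks to the other end, shows the robust-yet-fragile effect:

```python
from functools import lru_cache

N, K = 60, 5   # cells in the 1-D forest, firebreak budget

def spark_dist(n, decay=5.0, reverse=False):
    """Geometric spark distribution, concentrated at one end."""
    w = [2.0 ** (-(i / decay)) for i in range(n)]
    if reverse:
        w.reverse()
    s = sum(w)
    return tuple(x / s for x in w)

def expected_yield(blocks, p):
    """Expected surviving trees: a spark landing anywhere in a
    contiguous block burns that entire block."""
    total, i = 0.0, 0
    for size in blocks:
        total += size * (1.0 - sum(p[i:i + size]))
        i += size
    return total

def optimal_blocks(p, k):
    """DP over contiguous partitions of N cells using at most k breaks,
    maximizing expected yield under spark distribution p."""
    n = len(p)

    @lru_cache(maxsize=None)
    def f(i, breaks):
        if i == n:
            return 0.0, ()
        ends = range(i + 1, n + 1) if breaks > 0 else [n]
        best, best_blocks = -1.0, ()
        for j in ends:
            size = j - i
            gain = size * (1.0 - sum(p[i:j]))
            rest, tail = f(j, breaks - 1)
            if gain + rest > best:
                best, best_blocks = gain + rest, (size,) + tail
        return best, best_blocks

    return f(0, k)[1]

p_design = spark_dist(N)                  # sparks concentrated on the left
p_shifted = spark_dist(N, reverse=True)   # sparks concentrated on the right

hot = optimal_blocks(p_design, K)         # small blocks where sparks are likely
even = (N // (K + 1),) * (K + 1)          # naive equal-sized blocks

# The optimized layout wins under the distribution it was tuned for...
print(expected_yield(hot, p_design), expected_yield(even, p_design))
# ...but loses badly when the spark distribution shifts.
print(expected_yield(hot, p_shifted), expected_yield(even, p_shifted))
```

The optimizer builds tiny blocks where sparks are common and one huge block where they are rare; when the rare region starts receiving sparks, that huge block burns all at once, which is the catastrophic fragility HOT predicts.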
Many systems exist in optimized states, either as a result of design or evolution. In the case of an evolved system, they resist failure in the environment in which the system evolved. Designed systems resist failure in the cases envisioned by the designers. Small events that have never been encountered before, or that the designers did not expect, can cause catastrophic system failure... Seemingly small changes can cause large-scale global effects. A butterfly flaps its wings and the system crashes.
... Complex systems also exist in the society around us, particularly in the financial system.
The financial system in the modern world is also an optimized system that can be viewed as a strongly hub-based network. The hubs consist of the large banks (e.g., Citicorp, Union Bank of Switzerland (UBS), Bank of America)... One might expect that a system as large as the US financial system would only be vulnerable to large events (e.g., the demise of the Internet bubble). However, there is reason to believe that the US financial system, like any highly engineered optimized system, is prone to failure caused by small unexpected events.
Linked: The New Science of Networks