The lesson quants don't want to hear -- Highly optimized systems are prone to catastrophic failure

This topic is archived.
Home » Discuss » Topic Forums » Economy
 
steven johnson Donating Member (1000+ posts) Send PM | Profile | Ignore Sat Sep-12-09 11:11 PM
Original message
The lesson quants don't want to hear -- Highly optimized systems are prone to catastrophic failure
Edited on Sat Sep-12-09 11:18 PM by steven johnson
Tomorrow's New York Times has an article, 'Creating Quant Models That Are Closer to Reality.' The article's premise: "That failure suggests new frontiers for financial engineering and risk management, including trying to model the mechanics of panic and the patterns of human behavior."

They are ignoring the study of complex, highly optimized systems governed by nonlinear processes, going back to René Thom's catastrophe theory of the 1960s and to crashing predator-prey populations in the wild governed by simple differential equations.
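Thom's point can be made numerically. Here is a minimal sketch of his cusp catastrophe (a toy example of my own, not from the Times article): the potential V(x) = x⁴/4 + ax²/2 + bx has a local minimum that vanishes at a fold point as the control parameter b drifts smoothly, so the state jumps discontinuously even though nothing in the inputs did.

```python
import numpy as np

# Cusp catastrophe toy: V(x) = x**4/4 + a*x**2/2 + b*x.  As the control
# parameter b drifts smoothly, the local minimum the system sits in can
# vanish at a fold point, forcing a sudden jump to the other minimum.
def settle(a, b, x):
    """Follow the nearest local minimum of V by gradient descent."""
    for _ in range(10000):
        x -= 0.01 * (x**3 + a * x + b)   # step down the gradient V'(x)
    return x

a, x = -1.0, -1.2                        # start in the left-hand well
xs = []
for b in np.linspace(0.5, -0.5, 101):    # drift b smoothly downward
    x = settle(a, b, x)
    xs.append(x)

jump = float(np.abs(np.diff(xs)).max())
print(jump)   # one step contains a large discontinuous jump: the catastrophe
```

Every input change is tiny (b moves by 0.01 per step), yet one output change is of order one: that is the qualitative behavior catastrophe theory classifies.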

One of the basic implications of optimization theory is that HIGHLY OPTIMIZED SYSTEMS ARE PRONE TO CATASTROPHIC FAILURE.

They won't succeed.




Networks and systems whose nodes have power-law-distributed connections can be resistant to random failure, but are vulnerable to carefully targeted attack.
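That "robust yet fragile" asymmetry is easy to reproduce in a rough sketch (my own construction, not taken from any of the quoted sources): grow a small preferential-attachment graph, then compare the surviving largest component after knocking out random nodes versus the highest-degree hubs.

```python
import random
from collections import defaultdict, deque

random.seed(1)

def preferential_attachment(n, m=2):
    """Grow a rough scale-free graph: each new node links to about m
    existing nodes chosen in proportion to their current degree."""
    adj = defaultdict(set)
    weighted = list(range(m))            # node ids, repeated by degree
    for v in range(m, n):
        for t in set(random.sample(weighted, m)):
            adj[v].add(t)
            adj[t].add(v)
            weighted += [v, t]
    return adj

def largest_component(adj, removed):
    """Size of the largest connected component after deleting `removed`."""
    seen, best = set(removed), 0
    for s in adj:
        if s in seen:
            continue
        queue, size = deque([s]), 0
        seen.add(s)
        while queue:
            u = queue.popleft()
            size += 1
            for w in adj[u]:
                if w not in seen:
                    seen.add(w)
                    queue.append(w)
        best = max(best, size)
    return best

n, k = 2000, 40                          # 2,000 nodes; knock out 40 (2%)
adj = preferential_attachment(n)
by_degree = sorted(adj, key=lambda v: len(adj[v]), reverse=True)
random_hit = set(random.sample(range(n), k))
targeted_hit = set(by_degree[:k])

print(largest_component(adj, random_hit))    # random failure: barely shrinks
print(largest_component(adj, targeted_hit))  # attack on hubs: fragments far more
```

The same 2% of nodes removed, but choosing the hubs tears out a disproportionate share of the wiring, which is exactly why a hub-based financial network concentrates systemic risk in its largest banks.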

Highly Optimized Tolerance (HOT).

HOT theory examines robustness and catastrophic failure in large complex systems. HOT applies to complex systems which show optimal behavior in some dimension. For example, one of the early abstract models of a HOT system involves a theoretical forest. Random "sparks", with a particular distribution, land in the forest and burn areas of trees that are interconnected. A forest manager can build fire breaks that separate one section of the forest from another, preventing the spread of a forest fire caused by a random spark. An optimized forest is one that preserves the number of trees, while minimizing the damage caused by forest fire. Using a variety of optimization algorithms, Carlson and Doyle found that the optimized abstract forest was robust for one distribution of sparks, but could be catastrophically fragile in the case of another distribution. The losses in the forest fire model followed a power law.

Many systems exist in optimized states, either as a result of design or evolution. In the case of an evolved system, they resist failure in the environment in which the system evolved. Designed systems resist failure in the cases envisioned by the designers. Small events, which have never been encountered before or which the designers did not expect, can cause catastrophic system failure... Seemingly small changes can cause large scale global effects. A butterfly flaps its wings and the system crashes.
... Complex systems also exist in the society around us, particularly in the financial system.
The financial system in the modern world is also an optimized system that can be viewed as a strongly hub-based network. The hubs consist of the large banks (e.g., Citicorp, Union Bank of Switzerland (UBS), Bank of America)... One might expect that a system as large as the US financial system would only be vulnerable to large events (e.g., the demise of the Internet bubble). However, there is reason to believe that the US financial system, like any highly engineered optimized system, is prone to failure caused by small unexpected events.
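The forest-fire model above can be caricatured in one dimension (a drastic simplification I made up; Carlson and Doyle's models are two-dimensional grids with real optimization): place firebreaks where sparks are expected, leave one huge unprotected compartment where they are rare, and then see what happens when the spark distribution is not the one designed for.

```python
import numpy as np

N = 100                                  # cells in a 1-D "forest"
p = 0.9 ** np.arange(N)                  # sparks concentrated on the left
p /= p.sum()
p_design, p_shifted = p, p[::-1]         # designed-for vs. unexpected sparks

# A HOT-style layout: spend firebreaks where sparks are common, leave one
# huge compartment where they are rare.  (Positions are hand-tuned here,
# standing in for a real optimizer.)
edges = [0, 5, 10, 15, 20, 30, N]        # compartment boundaries

def expected_burn(spark_prob):
    """Expected cells lost to one spark: a spark anywhere in a
    compartment burns the whole compartment."""
    return sum(spark_prob[a:b].sum() * (b - a)
               for a, b in zip(edges[:-1], edges[1:]))

print(expected_burn(p_design))   # small loss under the designed-for sparks
print(expected_burn(p_shifted))  # near-total loss of the big compartment
```

The layout is excellent against the distribution it was optimized for and terrible against its mirror image: robustness to the expected bought fragility to the unexpected.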



Linked: The New Science of Networks, by Albert-László Barabási


scentopine Donating Member (1000+ posts) Send PM | Profile | Ignore Sun Sep-13-09 12:15 AM
Response to Original message
1. Agree - as an engineer I've seen a few optimization failures -
Edited on Sun Sep-13-09 12:16 AM by scentopine
To be a "good" or credible economist, you have to communicate in the most abstract mathematical language. They want to be like quantum physicists, trying to explain the world with perfect mathematical models. The more they try, the more they fail.

There is no margin in today's engineering (economic or otherwise); everything hinges on unrealistically small deviations from simplistic design assumptions.
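The cost of shaving margins can be put in one toy calculation (all numbers invented for illustration): if real loads scatter around the nominal value, a design "optimized" to a thin safety factor fails constantly, while an old-fashioned generous factor absorbs the scatter.

```python
import random
random.seed(2)

def failure_rate(safety_factor, trials=100_000):
    """Fraction of trials where a load, drawn +/-30% around nominal,
    exceeds the designed capacity (nominal * safety_factor)."""
    fails = sum(random.uniform(0.7, 1.3) > safety_factor
                for _ in range(trials))
    return fails / trials

print(failure_rate(1.05))   # margin-shaved design: fails often
print(failure_rate(1.5))    # generous margin: never, in this load model
```

The optimization saves material on every unit and pays for it in the tail of the load distribution.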
 
On the Road Donating Member (1000+ posts) Send PM | Profile | Ignore Sun Sep-13-09 01:24 AM
Response to Reply #1
2. While I am Not an Engineer
I thought that in civil engineering quite a large margin had to be built into a project.
 
Mopar151 Donating Member (1000+ posts) Send PM | Profile | Ignore Sun Sep-13-09 05:37 AM
Response to Reply #2
3. Depends
On what you consider a large margin. And most CE projects are bid out, which is a powerful incentive to reduce those margins. If we could get more politicians to think like you, the roads would look a lot better...
 
scentopine Donating Member (1000+ posts) Send PM | Profile | Ignore Sun Sep-13-09 10:52 AM
Response to Reply #2
4. Great question; worst cases: the Boston tunnel, the Shuttle, the Concorde
I think in general that is true for civil engineering, but it takes years to know for sure. Computer modeling allows you to design with minimal margins. The examples I gave happened with only the smallest deviation from ideal conditions.

Here is a more subtle example: there is a bridge in Vancouver, Canada, called the Alex Fraser. I drove across it once when the wind was just right to lift and tilt the bridge deck, creating cable slack on the windward side. The wind was probably 30-40 mph. Those slack cables were whipping back and forth like giant rubber bands, and the bridge deck was vibrating and swaying like a small earthquake - it was amazing. If you look up the bridge you'll see it is massive. I have to believe this is in the design, but there must have been massive stresses on both sides - everything must be designed perfectly to prevent failure under these conditions. Dynamic conditions like this are difficult to model.

The Boston tunnel will be a case study for years to come: they used epoxy to anchor the bolts holding concrete roof slabs, the epoxy formula was slightly wrong, and accusations of cost cutting were lobbed. Each of these disasters was caused by the slightest deviation from ideal conditions.

However, the mathematics of economic "engineering" is the ultimate voodoo. It never takes into account the cold fact that the controlling interests in the "investment" business share the behavioral and motivational characteristics of petty criminals. (No lectures on generalizations until Wall Street stops precipitating national and global financial disasters and gives us some role models, and/or our Democratic leadership goes after Wall Street with teeth and head-chopping - I'm not holding my breath for that.)

Most popular mathematical economists are full of shit (no problem with this generalization, for the reasons above). In engineering and physics you can verify the validity of the math against real-world results *before* you implement it.

With mathematical models, the best you can do is "backtest": fit to data from the past and hope it predicts the future. Then they implement, take a reading in real time (like at the peak of the dot-com bubble, the peak of the housing bubble, the peak of the Wall Street bubble), and proclaim that the macroeconomic problem is *solved* and world prosperity is the proof. In other words, they throw their theories into practice and revise them after a billion people suffer. Likewise with the "trickle down" bullshit.
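A caricature of that failure mode (the numbers and the Gaussian value-at-risk model are invented for illustration, not anyone's actual model): calibrate a risk limit on a calm history, then watch it get breached constantly once the world switches to a fat-tailed regime the history never contained.

```python
import numpy as np
np.random.seed(0)

# "History": ten years of calm, roughly normal daily returns.
history = np.random.normal(0.0005, 0.01, 2500)

# A model "backtested" on that history: Gaussian 99% value-at-risk.
mu, sigma = history.mean(), history.std()
var_99 = mu - 2.33 * sigma        # losses should exceed this only 1% of days

# The future: a fat-tailed crisis regime the history never contained.
crisis = 0.02 * np.random.standard_t(df=3, size=2500)
breach_rate = (crisis < var_99).mean()
print(breach_rate)                # many times the promised 1%
```

The backtest was internally flawless; it simply could not see a regime that was absent from its data, which is the poster's point exactly.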

Backtesting could not take into account the removal of regulations, the sleazy SOBs at the SEC, the new derivative "instruments", the sub-prime markets, and the raw unchecked greed and fraud by our politicians and industry titans that led to the economic disaster we have today.

Legions of people should be in jail for this.





 
eridani Donating Member (1000+ posts) Send PM | Profile | Ignore Mon Sep-14-09 04:34 AM
Response to Original message
5. Hasn't that principle been around for a while?
A lot of redundancy makes systems stable at the cost of efficiency. Biologists and chemists are aware of it at any rate.
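The trade-off fits in a couple of lines (assuming independent failures, which is itself an optimistic assumption):

```python
# Redundancy buys reliability at the price of efficiency: with n
# independent backups, each failing with probability p, the system
# fails only if all n fail (probability p**n), but the resource
# cost scales linearly with n.
def system_failure(p, n):
    return p ** n

p = 0.05                      # 5% chance any one component fails
for n in (1, 2, 3, 4):
    print(n, system_failure(p, n))
```

Each extra backup cuts failure probability twentyfold but costs a full extra component: exactly the stability-for-efficiency exchange that optimization strips out.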
 
wuvuj Donating Member (874 posts) Send PM | Profile | Ignore Mon Sep-14-09 06:38 AM
Response to Original message
6. Mauldin had a piece on....
Edited on Mon Sep-14-09 06:39 AM by wuvuj
...the causes of failures in complex systems, maybe a year or two before the crash. I think he specifically mentioned derivatives.
 


Powered by DCForum+ Version 1.1 Copyright 1997-2002 DCScripts.com
Software has been extensively modified by the DU administrators



© 2001 - 2011 Democratic Underground, LLC