
Saturday, October 27, 2007

Anti-Virus Software for Your Portfolio?

The cover story (#1) of MIT's Technology Review magazine for November / December 2007 profiles the "quants" -- the geniuses in the financial industry who combined skills in financial analysis, mathematics, statistics and software development to create complicated financial derivatives and a myriad of automated strategies for profiting from them. The article, entitled Blow-Up, starts off with a rather over-the-top, melodramatic description of an early August conference held in the New York offices of Merrill Lynch in which 200 of the "smartest people on Wall Street" gathered to collectively scratch their heads about the inexplicable behavior of the market as the impact of the credit crunch began to emerge. From there, it actually provides a good introduction to the strategies for pricing derivatives that have evolved over the past thirty years, ranging from the Black-Scholes method, which attempts to model variability, to later Monte Carlo based methods -- made practical by ever-faster computers -- for estimating the variability of the variability in financial models.
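As an illustrative sketch (mine, not the article's), the two pricing approaches it describes can be compared in a few lines: the closed-form Black-Scholes price of a European call option next to a Monte Carlo estimate that simulates terminal prices under geometric Brownian motion and averages the discounted payoffs. All parameter values below are arbitrary.

```python
import math
import random

def black_scholes_call(s, k, r, sigma, t):
    """Closed-form Black-Scholes price for a European call."""
    d1 = (math.log(s / k) + (r + 0.5 * sigma ** 2) * t) / (sigma * math.sqrt(t))
    d2 = d1 - sigma * math.sqrt(t)
    # Standard normal CDF expressed via the error function
    cdf = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))
    return s * cdf(d1) - k * math.exp(-r * t) * cdf(d2)

def monte_carlo_call(s, k, r, sigma, t, n_paths=200_000, seed=42):
    """Monte Carlo estimate: simulate terminal prices under GBM, average payoffs."""
    rng = random.Random(seed)
    drift = (r - 0.5 * sigma ** 2) * t
    vol = sigma * math.sqrt(t)
    total = 0.0
    for _ in range(n_paths):
        s_t = s * math.exp(drift + vol * rng.gauss(0.0, 1.0))
        total += max(s_t - k, 0.0)          # call payoff at expiry
    return math.exp(-r * t) * total / n_paths  # discount back to today

# Arbitrary example: spot 100, strike 105, 5% rate, 20% volatility, 1 year
analytic = black_scholes_call(100, 105, 0.05, 0.2, 1.0)
estimate = monte_carlo_call(100, 105, 0.05, 0.2, 1.0)
```

With enough simulated paths the two numbers converge, which is the article's point: the Monte Carlo machinery earns its keep only on models too complicated for a closed-form answer, and that machinery is what cheap computing made practical.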

The two most insightful points of the article come towards the end and provide the most food for thought:

In the mid-1990s, [said Greg Berman of RiskMetrics] a good algorithm might trade successfully for three or four years. But the half-life of an algorithm's viability, he says, has been coming down, as more quants join the markets, as computers get faster and able to crunch more data, and as more data becomes available. Berman thinks two or three months might be the limit now, and he expects it to drop.

Sound familiar? Sound like the life span of the signatures used by your PC's anti-virus program?

A little later, the author includes this quote from Emmanuel Derman, noted professor of financial engineering (an actual college curriculum) at Columbia University:

Quantitative finance superficially resembles physics but the efficacy is very different. In physics, you can do things to 10 significant figures and get the right answer. In finance, you're lucky if you can tell up or down.

Curiously, the author of the article spends no time elaborating on these points. Except for those purely interested in the market meltdown as a mathematical problem-solving exercise, these are probably the most important ideas to consider. Even more curiously, a column at the end of the magazine recounts an op-ed piece in the same magazine by Lester Thurow, written after the October 1987 crash, in which he stated that human herd behavior, rather than computers, was to blame for that crash.

Go back to the comment on the half-life of computer trading algorithms for a second. Does an investing strategy based on finding firms that consistently grow earnings have a half-life? Does an investing strategy that cannot work if anyone else uses it sound like something on which to bet your retirement? We really may not have a choice.
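To make the half-life point concrete, here is a toy decay model (my illustration, not anything from the article): treat an algorithm's remaining edge as halving once per half-life, and compare Berman's mid-1990s regime of three to four years against his 2007 estimate of two to three months. The specific half-life figures are assumptions chosen from within his quoted ranges.

```python
def remaining_edge(initial_edge, half_life_months, months_elapsed):
    """Fraction of an algorithm's original edge left, assuming exponential decay."""
    return initial_edge * 0.5 ** (months_elapsed / half_life_months)

# Assumed regimes, loosely based on the Berman quote above:
#   mid-1990s: viability of ~3.5 years -> half-life taken here as 42 months
#   2007:      viability of ~2-3 months -> half-life taken here as 2.5 months
edge_1990s = remaining_edge(1.0, 42, 12)   # edge left after one year, old regime
edge_2007 = remaining_edge(1.0, 2.5, 12)   # edge left after one year, new regime
```

Under these assumed numbers, a strategy retains over 80 percent of its edge after a year in the old regime but under 5 percent in the new one -- one way of seeing why a strategy that stops working once others adopt it is a shaky thing to bet a retirement on.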

The analogy of computer trading algorithms in financial markets to computer viruses on a worldwide network is very apropos. One would think the goals are diametrically different -- trading is intended to produce a good result, viruses are designed to produce problems in a network -- but think again. A hedge fund with a multi-million dollar bet based on finding a temporary price differential on a complicated derivative could spend vast amounts of time creating an algorithm to identify that delta before everyone else. It could instead spend less time simply manipulating the markets to produce the delta, execute its trades, then get out. With trades initiated from potentially thousands of accounts in thousands of locations, who would be able to reverse engineer the attempt amidst a market trading six billion shares daily?

The danger of even a few rogue traders operating in a vast, highly automated market -- where thousands of other traders depend on highly complicated but incompletely understood programs -- is identical to the danger of a few malicious virus developers turning their code loose on a worldwide network of complicated PCs running complicated software that isn't well understood either. Combine complicated, automated trading strategies that aren't always assured of functioning correctly with vast computer networks that supply the real-time information driving those trades -- networks prone to random catastrophic faults -- and it becomes pretty apparent that somewhere out there, lurking in the future, is a "Melissa" for the market. The implication is even worse because market mechanisms have evolved to the point where typical human herd behavior isn't necessarily the dominant force in either direction. The dominant force may be nothing more than a ghost in the machine: flawed human algorithms implemented in flawed software running on flawed networks.


#1) http://www.technologyreview.com/Infotech/19529/?a=f