A recent study by Tabb Group, based on a survey of 260 market participants, reported that confidence in the markets continues to fall with every technology-associated market crash. Following Knight Capital’s fiasco, 26% of respondents considered the present market structure to be “very weak.” By contrast, in June 2012 only 7% of market participants deemed market structure “very weak,” and in May 2010 only 3% of those polled thought so.
One of the regulatory solutions for strengthening market structure mentioned in the poll is the idea of slowing the markets down. In fact, according to the poll, 31% of asset managers, 20% of broker-dealers, 18% of financial services vendors, and 10% of exchanges believed that slowing down markets “would help minimize the types of events we have seen in 2012, or, more broadly, help the industry regain the trust of investors.”
While the idea of slowing down markets did not generate a majority vote among any specific group, the support for the measure is quite surprising. It is particularly surprising to find 20% of broker-dealers backing the idea, as the profitability of their primary business, market-making, grows with increases in trading speed.
A study of market-making in actively traded stocks on the Stockholm Stock Exchange, for example, found that the expected profit on limit orders increased as the time between market orders decreased. The study, written by Sandas (Review of Financial Studies, 2001) and confirmed by Beltran, Grammig and Menkveld (working paper, 2005), shows that market-makers’ profitability does not grow as they wait their turn at the exchange; instead, it grows as their order-matching frequency increases. In other words, modern market-makers’ profits are directly related to the number of market orders they service: the more trades a market-maker can take on within a given period of time, the higher the market-maker’s profitability.
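To make the economics concrete, here is a deliberately simplified back-of-the-envelope sketch in Python (a toy model with hypothetical numbers, not Sandas’s actual specification): if the market-maker earns a fixed edge on each filled limit order, total profit over a session scales linearly with the number of fills, so shrinking the time between incoming market orders raises profitability directly.

```python
# Toy sketch (not Sandas's model): the per-fill edge is fixed, so daily
# profit scales with how many market orders the market-maker services.

def daily_mm_profit(half_spread, adverse_cost, seconds_between_fills,
                    session_seconds=6.5 * 3600):
    """Hypothetical expected daily profit per share of quoted size."""
    fills = session_seconds / seconds_between_fills  # order-matching frequency
    return fills * (half_spread - adverse_cost)      # edge per fill x fill count

# Halving the gap between incoming market orders doubles the fill count,
# and with it the profit, for the same per-trade edge:
for gap in (10.0, 5.0, 2.5):  # seconds between incoming market orders
    profit = daily_mm_profit(half_spread=0.01, adverse_cost=0.004,
                             seconds_between_fills=gap)
    print(f"{gap:>4.1f}s between fills -> {profit:,.2f} per share of size")
```

Under these made-up parameters the per-fill edge never changes; only the matching frequency does, which is exactly the channel the Stockholm study identifies.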
By contrast, market-making theories of the 1980s presumed that market-makers’ profits increased with their waiting time to order execution, identifying slower execution as a driver of higher profitability. The thinking went that market-makers were patient traders, compensated for their time making markets rather than for the number of trades they took on. Such models reflected exchange conditions circa the 1970s, when changing the details of a limit order or cancelling it was prohibitively costly, and market-makers indeed expected tidy compensation for bearing the risk of ending up in an adverse position once their limit orders were hit. In most modern markets, however, limit order cancellations and revisions can be performed free of charge at the time of this writing, and high-frequency market-makers enjoy better profitability than their low-frequency counterparts.
Still, some brokers who argue against fast execution are under the impression that the technology required for successful trading in today’s computerized markets is prohibitively expensive. The main reason behind such thinking also happens to be brokers’ past experience: just fifteen years ago, the technology required for a modern high-frequency setup cost tens of millions of dollars. Today, comparable systems cost a few thousand dollars. This drastic reduction in technology costs has been driven by two main developments, both completely independent of financial services:
1. Demand for cheap yet powerful hardware from legions of budget-conscious video-game players
2. Overseas manufacturing capabilities
The very latest iterations of video games require ever more drastic technological improvements: specialized chips that receive and process information quickly, demands much like those of high-frequency markets. Graphics Processing Units (GPUs) were originally designed for ultra-fast processing of video graphics, while Field-Programmable Gate Arrays (FPGAs) are reconfigurable chips that can be programmed for similarly specialized high-speed tasks. Both are now making rapid inroads on Wall Street, where the technology is being adopted for more serious applications, such as market-making algorithms processing billions of dollars in positions a day. A blank FPGA chip costs as little as $100, yet, when properly programmed and installed in a regular PC, it can deliver performance comparable to a cluster of thirty to three hundred interconnected computers. Such amplification of computing power delivers further savings to financial institutions seeking to modernize their technology: less spending on computers, ventilation, and physical space.
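The performance claim is easiest to appreciate through data parallelism, the style of computation that GPUs and FPGAs accelerate: one operation applied across thousands of processing elements at once. The sketch below is only an analogy in ordinary Python with NumPy (the instrument counts and prices are invented for illustration), contrasting a quote-refresh calculation done one instrument at a time with the same calculation done over the whole array in a single pass.

```python
import time
import numpy as np

# Hypothetical task: refresh bid/ask quotes for many instruments at once.
n = 1_000_000
mid = np.random.uniform(10, 500, n)   # mid-prices (invented)
vol = np.random.uniform(0.1, 2.0, n)  # per-instrument volatility proxy (invented)

# Serial style: one instrument at a time, as a plain CPU loop would do it.
t0 = time.perf_counter()
quotes_loop = [(m - v, m + v) for m, v in zip(mid, vol)]
t1 = time.perf_counter()

# Data-parallel style: the whole array in one pass -- the same shape of
# parallelism an FPGA pipeline or a GPU applies in hardware.
t2 = time.perf_counter()
bids, asks = mid - vol, mid + vol
t3 = time.perf_counter()

print(f"loop: {t1 - t0:.3f}s   vectorized: {t3 - t2:.3f}s")
```

NumPy here merely stands in for the idea; dedicated hardware takes the same principle much further, which is where the cluster-scale performance figures come from.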
In summary, fast markets improve market-makers’ profitability without outrageous cash outlays. The biggest cost of implementing modern market-making models lies in the know-how of designing and implementing the algorithms, a discipline that is hardly straightforward. The last two years, however, have witnessed an explosion of research and other information on the subject, bound to convert even the most hard-crusted anti-speed market-makers to the camp of profitable trading.
Irene Aldridge is teaching courses on design and implementation of algorithms in Chicago (September 5 and 6, 2012) and New York (September 20 and 21, 2012). For more information and to register, please visit http://www.hftcourse.com today. She will also be speaking on a panel about algorithmic trading at HedgeWorld New York 2012, to be held Sept. 19 at the Metropolitan Club.