Financial markets are at risk of a ‘big data’ crash


Regulators and investors are struggling to meet the challenges posed by high-frequency trading. This ultra-fast, computerised segment of finance now accounts for most trades. HFT also contributed to the "flash crash", the sudden, vertiginous fall in the Dow Jones Industrial Average in May 2010, according to US regulators. However, today's HFT is very different from that of three years ago. The reason is "big data".

The term describes data sets that are so large or complex (or both) that they cannot be efficiently managed with standard software. Financial markets are significant producers of big data: trades, quotes, earnings statements, consumer research reports, official statistical releases, polls, news articles, etc.

Companies that have relied on the first generation of HFT, in which raw speed alone is used to exploit price discrepancies, have had a tough few years. Profits at ultra-fast trading firms were 74 per cent lower in 2012 than in 2009, according to Rosenblatt Securities. Being fast is not enough. We, along with Marcos Lopez de Prado of the Lawrence Berkeley National Laboratory, have argued that HFT companies increasingly rely on "strategic sequential trading". This consists of algorithms that analyse financial big data in an attempt to recognise the footprints left by specific market participants. For example, if a mutual fund tends to execute large orders in the first second of every minute before the market closes, algorithms able to detect that pattern will anticipate what the fund is going to do for the rest of the session, and make the same trade first. The fund will keep making the same trade, but at ever higher prices, and the "algo" traders cash in.
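To make the idea concrete, here is a deliberately simple sketch in Python (our illustration, not any firm's actual code) of how such a footprint might be scored. The Trade record, the size threshold and the pre-close window are all hypothetical assumptions; real detectors work on full order-book data and proper statistical tests.

    # A minimal, illustrative sketch of "footprint" detection, not a real trading system.
    # Assumption: we have flat trade records (timestamp, size) and want to know how often
    # large orders near the close land in the first second of a minute.

    from dataclasses import dataclass
    from datetime import datetime

    @dataclass
    class Trade:
        timestamp: datetime  # execution time
        size: float          # shares traded

    def footprint_score(trades, large_size=10_000, close_hour=16, window_minutes=30):
        """Fraction of large pre-close trades arriving in the first second of a
        minute; a value far above ~1/60 (chance) hints at a scheduled participant."""
        pre_close_large = [
            t for t in trades
            if t.size >= large_size
            and t.timestamp.hour == close_hour - 1          # hour before the close
            and t.timestamp.minute >= 60 - window_minutes   # last `window_minutes` minutes
        ]
        if not pre_close_large:
            return 0.0
        on_the_minute = [t for t in pre_close_large if t.timestamp.second == 0]
        return len(on_the_minute) / len(pre_close_large)

    if __name__ == "__main__":
        # Hypothetical sample: three large trades, two landing exactly on the minute.
        sample = [
            Trade(datetime(2013, 6, 3, 15, 45, 0), 25_000),
            Trade(datetime(2013, 6, 3, 15, 46, 0), 30_000),
            Trade(datetime(2013, 6, 3, 15, 47, 12), 20_000),
        ]
        print(f"footprint score: {footprint_score(sample):.2f}")  # prints 0.67 here

The point of the toy score is simply that a predictable execution schedule, such as trading on the exact minute mark, shows up as a number far above chance, and that is the kind of pattern other algorithms learn to trade against.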
