Equities trading floors have been obsessed with speed for nearly two decades, as technology has allowed computers to transact in milliseconds or even microseconds. Much of that race has centered on hardware and data feeds, but today unstructured data and artificial intelligence are the new differentiators.
The reliance on servers and feeds led to recurring worries in the industry about a “race to the bottom”. Latency – the time required for an action to generate a response – seemed poised to hit the point of diminishing marginal return on investment.
Yet machines competing on latency still dominate most stock turnover. On some U.S. exchanges, tick-to-trade latencies – the time from receiving a market-data update to sending an order in response – are now measured in nanoseconds.
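Conceptually, tick-to-trade latency is just the gap between two timestamps: one taken when market data arrives and one taken when the resulting order leaves. The sketch below illustrates the idea in Python; real HFT systems measure this with hardware timestamping on FPGAs or network cards, not application code, and the `decide` strategy stub here is purely hypothetical.

```python
import time

def decide(tick):
    # Hypothetical strategy stub: buy if the price drops below a threshold.
    return {"side": "buy"} if tick["price"] < 100 else None

def handle_tick(tick):
    t_in = time.monotonic_ns()       # timestamp when the market-data tick arrives
    order = decide(tick)             # strategy logic produces (or skips) an order
    t_out = time.monotonic_ns()      # timestamp when the order would be sent
    return order, t_out - t_in       # tick-to-trade latency in nanoseconds

order, latency_ns = handle_tick({"price": 99.5})
```

In an interpreted language this gap will be microseconds at best; the point is only where the two clock reads sit relative to the decision logic.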
“Declaring that the latency race is over goes back at least 12 years,” said Peter Lankford, executive director of STAC, a company that coordinates financial institutions and vendors to develop technology standards in capital markets. “But we’re still seeing competitive latencies fall by a factor of 10 every three years.”
While new latency-reducing technologies are often so expensive that initially only an elite set of firms can afford them, economics on the vendor side tend to drive down costs and make them available to a broader set of firms over time.
“This dispersion of innovation fuels a continued latency race across most of the industry,” Lankford said. “Everybody has to move along the curve. Even if you’re not in the nanosecond game, trades that might have been acceptable at, say, 100 microseconds a few years ago now require you to trade at 10 microseconds.”
The range of technologies and processes involved in high-frequency trading (HFT) is broad. Specialist companies have emerged to manage market-data feeds, messaging middleware, execution workloads, and connecting software applications to networks.
Within networks, there’s yet another layer of specialists who run servers, operating systems and APIs, among other things. Then there are the software specialists to help firms read market data, affix a time stamp to a transaction, or assemble a FIX message at lightning speed.
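Assembling a FIX message, mentioned above, means concatenating tag=value fields separated by an SOH byte, then prepending the standard header (BeginString, BodyLength) and appending a three-digit checksum. A minimal sketch, with illustrative placeholder order values:

```python
SOH = "\x01"  # FIX field delimiter

def assemble_fix(fields):
    """Build a FIX 4.2 message: header (tags 8, 9) + body + checksum (tag 10)."""
    body = SOH.join(f"{tag}={val}" for tag, val in fields) + SOH
    head = f"8=FIX.4.2{SOH}9={len(body)}{SOH}"   # BodyLength counts the body bytes
    msg = head + body
    checksum = sum(msg.encode()) % 256            # sum of all bytes so far, mod 256
    return f"{msg}10={checksum:03d}{SOH}"

order = assemble_fix([
    (35, "D"),       # MsgType: NewOrderSingle
    (55, "IBM"),     # Symbol
    (54, "1"),       # Side: buy
    (38, "100"),     # OrderQty
    (40, "2"),       # OrdType: limit
    (44, "135.25"),  # Price
])
```

Latency-sensitive implementations do the same thing in C++ or on hardware, with preformatted templates so only a few bytes change per order.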
Exchanges are constantly upgrading their hardware to attract the most flow, which means making data easier to access and use, both for data scientists at trading firms and for the exchange itself.
A data architecture consultant at a major Asian exchange says a big focus now is how venues store data and how efficiently they can make it available to their data scientists. Sometimes that means using cloud vendors; other times it’s more about improving access to data kept on on-premises servers.
The rise of alternative data
Driving this is the rise of data as a competitive asset.
“There’s new sources of unstructured data,” said Srihari Angaluri, technical director for software and solutions at Lenovo in Raleigh-Durham, North Carolina. HFT shops are now incorporating things like social-media sentiment analysis into their trading models. “They need to connect quantitative, structured data with new types of data, and then extract intelligence from that. There’s a very rich area for financial services to use A.I.”
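Connecting structured quantitative data with unstructured sources often comes down to reducing each to a numeric signal and weighting them together. The toy sketch below is purely illustrative: the weights, the averaged-returns "momentum" measure, and the sentiment score are hypothetical stand-ins, not a real model.

```python
def combined_signal(price_returns, sentiment_score, w_quant=0.7, w_sent=0.3):
    """Blend a structured momentum signal with an unstructured sentiment score.

    price_returns: recent per-period returns (structured market data)
    sentiment_score: score in [-1, 1] from, e.g., social-media text analysis
    """
    momentum = sum(price_returns) / len(price_returns)  # simple average return
    return w_quant * momentum + w_sent * sentiment_score

# Positive recent returns plus mildly bullish sentiment -> positive signal.
sig = combined_signal([0.01, -0.002, 0.005], sentiment_score=0.4)
```

In practice the sentiment score itself would come from an NLP pipeline, and the weights from fitting a model, but the shape of the combination is the same.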
“The excitement is in A.I. and machine learning,” said John Ashley, general manager for financial services and technology at Nvidia, the Santa Clara, California-based hardware computing company. “This is not to separate the winners from the losers – but to separate the survivors from the footnotes.”
He says many fintechs are at heart data companies, disrupting capital markets by using data more intelligently. Incumbents – banks and big asset managers – have plenty of data at hand, unlike fintechs, but are only now waking up to new competitive business models, such as outsourcing more R&D to fintechs and deploying their balance sheets to grow their data advantage.
Constant innovation is critical, but it’s also expensive, and the lack of industry standards means that firms have to be able to diversify and adapt.
“To be a survivor,” Ashley said, “be adaptive, fail often, and fail fast.”