Financial services institutions are under great pressure to ensure that they have the appropriate infrastructure in place to handle rising market data volumes, and are turning to technology to cope. Indeed, spend by European and US buy side firms on front office market data infrastructure is set to reach $484 million by 2009, while spend by sell side firms will peak at $1.9 billion.
Market data has always been a fundamental element of the capital and financial markets. However, upcoming regulations such as Regulation National Market System (Reg NMS) and the Markets in Financial Instruments Directive (MiFID) will place immense pressure on firms to deliver accuracy and transparency. This has raised the significance of market data and brought the issue back onto the strategic agenda.
Investment into market data infrastructures will continue to grow
As the global economy has expanded, market data requirements have continued to grow. According to Datamonitor research, the total European and US financial services industry front office market data infrastructure IT spend currently stands at $2.1 billion.
The upcoming increase in market data will have profound impacts on market data infrastructures. One impact will be the need for storage; another will be ensuring that firms have the relevant analytics in place to deal with the anticipated increase in volumes. Forthcoming regulations require firms to store data for five years and, with storage requirements doubling annually, the pressure on systems to accommodate the data will be immense. Given this storage pressure, IT spend on market data storage is expected to peak in 2008.
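The compounding effect of annual doubling combined with a five-year retention window can be sketched with some back-of-envelope arithmetic. The starting volume (10 TB) below is a hypothetical figure for illustration, not a number from the research.

```python
def retained_storage(initial_tb: float, years: int, retention: int = 5) -> list:
    """Cumulative storage held at the end of each year, in TB, when the
    yearly data volume doubles annually and each year's data must be
    retained for `retention` years."""
    yearly = [initial_tb * 2 ** y for y in range(years)]
    totals = []
    for y in range(years):
        # Only data still inside the retention window must be kept.
        window = yearly[max(0, y - retention + 1): y + 1]
        totals.append(sum(window))
    return totals

print(retained_storage(10, 5))  # [10, 30, 70, 150, 310]
```

With a hypothetical 10 TB captured in year one, the firm must hold 310 TB by year five, a thirty-fold increase over four years, which is why storage spend rises so steeply.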
Maintaining data quality a major hurdle throughout trading lifecycle
A key element sometimes overlooked is data quality, as many financial services institutions (FSIs) are embroiled in the search for low latency. A Datamonitor respondent encapsulated this sentiment: "Everyone is looking for low latency, but the faster you go, the more opportunities for mistakes there are. If you are taking in as much raw data as possible, you have to make sure it is good data." This means firms should introduce data cleansing solutions, even though these add latency.
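A minimal sketch of what such a cleansing step might look like is shown below: each incoming tick passes through validation checks before reaching downstream systems, and every check adds processing time. The field names (symbol, price, ts) and the five-second staleness threshold are illustrative assumptions, not drawn from the research.

```python
import time

def clean_tick(tick: dict) -> bool:
    """Illustrative data-cleansing check for an incoming market data tick.
    Rejects ticks with missing fields, non-positive prices, or stale
    timestamps. Field names and thresholds are hypothetical."""
    if not all(k in tick for k in ("symbol", "price", "ts")):
        return False  # missing a required field
    if tick["price"] <= 0:
        return False  # obviously corrupt price
    if time.time() - tick["ts"] > 5.0:
        return False  # older than 5 seconds counts as stale
    return True

# A well-formed, fresh tick passes; a negative price is rejected.
print(clean_tick({"symbol": "VOD.L", "price": 135.2, "ts": time.time()}))  # True
print(clean_tick({"symbol": "VOD.L", "price": -1.0, "ts": time.time()}))   # False
```

Even checks this simple consume microseconds per tick, which is the trade-off the respondent describes: the more raw data a firm ingests, the more validation it needs, and the more latency that validation introduces.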
Additionally, as low latency solutions become increasingly commoditized, firms will have to find new and innovative ways to mine data to stay ahead of the competition. The quality of data could therefore ultimately become more important than its speed. The key to gaining a competitive advantage will be to optimize speed while leveraging the quality of the data.
Ultimately, to ensure data quality, FSIs need a storage and analytics capability that provides consistent market data, reference data and analytics, linked to applications throughout the trading cycle.
Market data the ‘fuel’ for algorithmic trading models
Algorithmic trading, the use of advanced mathematical models to make transaction decisions in the financial markets, has been on a steady rise over the last few years, more so in the US than in other regions. The fragmentation of US markets has driven the development of the algorithmic technology and techniques needed to navigate them, and regulatory initiatives such as Regulation NMS in the US and MiFID in Europe will certainly accelerate this adoption further.
In line with this steady growth, market data continues to be the ‘fuel’ for algorithmic models. As such, data is embedded in every aspect of the trade decision process, from pre-trade through to post-trade analysis.
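A toy example makes the "fuel" metaphor concrete: even the simplest algorithmic model, such as a moving-average crossover, consumes a continuous stream of prices to generate its trade decisions. The model below is purely illustrative; the window lengths are arbitrary assumptions and no real strategy is implied.

```python
def sma(prices, n):
    """Simple moving average of the last n prices."""
    return sum(prices[-n:]) / n

def crossover_signal(prices, fast=3, slow=5):
    """Illustrative moving-average crossover model: emit 'buy' when the
    fast average rises above the slow one, 'sell' when it falls below.
    Window lengths (3 and 5) are hypothetical."""
    if len(prices) < slow:
        return "hold"  # not enough market data to fuel the model
    fast_ma, slow_ma = sma(prices, fast), sma(prices, slow)
    if fast_ma > slow_ma:
        return "buy"
    if fast_ma < slow_ma:
        return "sell"
    return "hold"

print(crossover_signal([100, 101, 102, 104, 107]))  # rising prices -> "buy"
print(crossover_signal([107, 104, 102, 101, 100]))  # falling prices -> "sell"
```

Note that the model can do nothing at all until enough clean price history has arrived, which is exactly why data quality and availability sit upstream of every trading decision.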
As automated trading becomes increasingly cross-asset, storage and analytics platforms must evolve to support cross-asset, next-generation trading. Although some within the industry would argue that this is straightforward, providing these high-performance tools while meeting the challenges of integrating the data is a complex process.