As firms wrestle with the opposing forces of reducing spend while increasing the performance of market data services to the enterprise, a reluctance to adopt an aggressive strategy for tackling a volatile market may be costing firms more than they can afford, says Frank Piasecki, president and co-founder, Activ Financial
All financial services industry survivors share the goal of getting more out of IT with less initial and ongoing investment. And nowhere is the pressure to lower costs more visible than in the market data space. For many firms, market data expenses are second only to personnel costs. In today's environment, firms are struggling to justify that proportion of investment in delivering data, and are now considering heretofore unimaginable strategies to rein in the legacy system dependencies built up over several market cycles.
The problem with these new strategies is that real change comes hand-in-hand with switching costs and hard-to-measure implementation risks, both of which serve to entrench traditional vendors that take advantage of the embedded nature of their service through non-competitive pricing and abusive upgrade paths.
For those charged with managing market data at firms recasting their overall business model and expenditures, something must be done to cut market data costs and support flexibility in a market where the only constant is change. Ripping out critical enterprise systems with poorly understood dependencies is not a pleasant option, so many projects stop before they begin. But with deeper understanding and quantifiable return on investment, there are new approaches and tactics for managing change now, as well as the inevitable change down the road. Thinking through that process and starting down the road to change should be every market data manager's prime directive. The alternative may impact the firm's overall ability to survive.
For all the continuous change in the financial markets, the constants in delivering market data are new content requirements, message rate growth, regulation and compliance, broader geographic scope, and a growth in functional requirements to serve ever-broader use cases that are also more physically dispersed than ever before: over trading floors, remote offices, exchange co-lo's, mobile devices, extranets, public clouds, and so on. There are new types of data from within and outside the organization, required by customers in and outside the firm, by regulators, and by new actors in the investment process demanding transparency within verifiable and defensible platforms.
The main issue with changing how firms receive and use data is that many of the traditional market data players are not motivated to solve this problem aggressively. Updating their infrastructures and technology to accommodate easy upgrades often entails customer risk and, of course, significant ongoing capital investment in R&D. Your incumbent vendor is motivated only by the risk of losing business.
Looking back over the progression of market data services-from dumb video screens underpinned with a model based on centralized systems with significant infrastructure requirements for the provider and users-you can see how far we have come. Today, we see market data use cases that are now also driven by co-location of market participants at liquidity venues, which has greatly reduced the costs tied to content distribution, and enabled providers to take over much of the management of the underlying infrastructure.
We call this market-data-as-a-service, an alternative model that can reduce overall market data costs by 50 percent. Today's market-data-as-a-service models are characterized by global content management and data delivery. Think of market data as a utility, like your water supply. You don't build and manage the pipes that deliver water to your home, but as long as you pay your monthly bill, you rely on it working. This pay-as-you-go model serves the financial services industry well, because the onus is on the service provider to invest in a fully integrated, managed service, freeing the customer from messy upgrades and glitches.
In today's climate, global institutions can find it difficult to justify switching costs and changing consumption behaviors, either due to simple inertia or a misguided belief that it will incur increased expense and risk. How can IT and business leaders respond if they know they need to make a switch but are concerned about risk and fear of the unknown?
When looking into a market data solution, it's vital both to identify cost savings upfront and to understand the business case for making a change. A solution that, for example, utilizes co-location facilities for market data, execution, and hosting trading and risk applications can improve not only trading performance but also cost efficiency. The removal of the required corporate firewall alone might justify the conversion. There's also the reduction in network and server costs, as well as data and labor costs.
The short-term view of legacy infrastructure investments can also sometimes slow the adoption of more efficient market data models. Firms that have invested in traditional vendor interfaces such as APIs, or have built internal infrastructure to support integration with outdated vendor models are often reluctant to pull out and retire that plumbing. In those instances, firms should scrutinize the ongoing costs associated with older market data models-including both the ongoing costs of infrastructure management and the implicit costs of sub-optimal data delivery inherent in outdated models.
Being able to retire operational costs while improving performance is a compelling call to action, once firms analyze the total cost of their current market data infrastructure. Replacing an in-house data infrastructure with an outsourced market data solution also shifts the capital risk from the firm to the market data provider, who is better suited to manage growing tick rates and network costs.
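To make that kind of total-cost analysis concrete, here is a minimal sketch of an annual TCO comparison between an in-house market data plant and a managed market-data-as-a-service model. Every line item and dollar figure below is an illustrative assumption, not a vendor quote; they are simply chosen so the managed model lands near the 50 percent savings figure cited earlier.

```python
# Hypothetical total-cost-of-ownership comparison. All line items and
# amounts (USD per year) are illustrative assumptions for a mid-sized firm.

def annual_tco(line_items: dict) -> float:
    """Sum annual cost line items."""
    return sum(line_items.values())

in_house = {
    "feed_handler_licenses": 400_000,
    "servers_and_network": 550_000,
    "corporate_firewall_and_security": 150_000,
    "market_data_ops_staff": 600_000,
    "upgrade_projects": 200_000,
}

managed_service = {
    "subscription_fees": 700_000,
    "co_location_cross_connects": 120_000,
    "residual_internal_support": 130_000,
}

savings = annual_tco(in_house) - annual_tco(managed_service)
pct = 100 * savings / annual_tco(in_house)
print(f"In-house: ${annual_tco(in_house):,.0f}")   # $1,900,000
print(f"Managed:  ${annual_tco(managed_service):,.0f}")  # $950,000
print(f"Savings:  ${savings:,.0f} ({pct:.0f}%)")   # $950,000 (50%)
```

The value of a spreadsheet-style exercise like this is less the exact figures than forcing every implicit cost of the legacy plant (staff, upgrades, security perimeter) onto the same ledger as the managed-service subscription.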
Across the vast majority of IT today, outsourcing a commoditized service offering is in an enterprise's best interest. Sure, a firm can buy a feed handler and normalize data, but can they do that across asset classes and in every market around the globe? Probably not.
Some firms will invest significant amounts of capital to shave a few microseconds off their trading strategies. But opportunities for firms trading purely on speed are diminishing. And for most firms, the return on investment from gaining a microsecond advantage doesn't justify the cost.
Understanding your firm's requirements is the first step to adopting a market data model that will deliver efficiencies-whether that means higher performance, lower costs, or somewhere in between. Firms need to ensure they identify areas to improve business performance, as efficiencies achieved today will impact tomorrow's profitability.