The cost of data continues to be a key issue for participants on both the sell- and buy-side, leaving many disgruntled that they are paying too much for datasets that are essential to their future success.
Following years of canvassing from market participants, the UK’s Financial Conduct Authority launched a three-pronged investigation into competition and cost in the data markets in January 2022.
However, in spite of this ongoing probe into the data sphere, market data pricing has continued to rise “faster than ever” and far faster than the rate of inflation, new data from Substantive Research has found.
According to the firm’s most recent findings, the average price increase for an unchanged customer use case at ratings agencies is 12% – not including year-on-year inflation increases within a multiyear contract. For index providers, the average price increase for an unchanged customer use case is 13%, also excluding year-on-year inflation increases within a multiyear contract.
Some outlier providers are repricing clients by 600%, added Substantive Research.
“This provider power is creating a dynamic in which asset managers and banks are on an unsustainable path – it will become economically unviable,” Substantive Research’s founder and chief executive officer, Mike Carrodus, told The TRADE.
“The backdrop of the regulatory scrutiny is not affecting the propensity of the most powerful providers to increase costs for identical use cases. There’s a lack of standardisation that people hide behind. This is a tough time. They [the buy- and sell-side] feel quite handcuffed to these providers. Pricing power is being exercised and this is an issue because there are tough choices being made across organisations at the moment.”
Read more – New research finds some buy-side firms paying 26 times more than others for index data as FCA investigation continues
The new data follows a similar study by Substantive Research in March, which found that some providers are charging certain buy-side firms 26 times more than others for similar index products and services.
The FCA’s data probe
The March findings followed the FCA’s conclusion of the first phase of its investigation in the same month, which found that competition in the wholesale data market is in fact not working. The UK watchdog confirmed in its findings that some trading markets are concentrated among too few firms, limiting choice for institutions and making it difficult to switch suppliers.
It also found that the process of procuring data – in particular the way it is sold – is too complex, again having the effect of limiting choices for investors when sourcing this essential data.
The FCA subsequently launched a new wholesale data market study under the Enterprise Act, inviting any persons wishing to make representations on the subject – most importantly on whether the matter should be referred to the Competition and Markets Authority as a market investigation reference under the Act – to do so by 30 March. The regulator is expected to publish its findings and chosen course of action by 1 March 2024.
Read more – Competition in wholesale data market is not working, first phase of FCA investigation finds
“Consumers of market data, on both the buy- and sell-side, are eagerly awaiting the results of this study early next year, which could have global ramifications,” said Substantive Research in a statement.
It is the second major probe into market data in recent years, after the European Union conducted a similar study that concluded with the introduction of reasonable commercial basis requirements.
The solution
The FCA’s study covers six areas: barriers to entry, network effects, vertical integration, suppliers’ commercial practices, the behaviour of data users and incentives to innovate.
For Carrodus, the core areas of concern are barriers to entry and suppliers’ commercial practices. Among the solutions aired in the last year are standardisation and the use of list pricing to reduce the disparity in fees charged to clients.
However, according to Carrodus, the current power dynamic within the data market has become dysfunctional to the point that firms – in particular smaller ones – fear higher prices in the event that providers decide to standardise across their client base.
“Clients believe standardisation could lead to higher prices. That doesn’t sound like a healthy functioning market. The words ‘list price’ exist in this market but it doesn’t actually exist. Our data shows that often in market data the amount of discounting that happens can be reasonably consistent. What’s not consistent is the original list price,” he said.
“Regulators say a transparent pricing model would be helpful. It doesn’t have to be perfectly adhered to. But the market is missing any anchor whatsoever due to the lack of list prices. There’s only so much they [regulators] can do. Unless the dynamics were created by them and they can unwind them. But they haven’t in most cases – it’s brand power.”
On their current path, increases in market data pricing will likely further concentrate flows among larger players able to shoulder the costs, accelerating the consolidation already taking place across most corners of the market.