Machine-readable data built on data standards was developed many years ago. XML-based technology, which is now beginning to take over from older structured data standards, should be transforming the supply chain, enabling large cost reductions for users and, more importantly, for investors. Yet despite successful efforts to introduce technology solutions and the widespread use of standards, we see no change in the supply chain, and little or no cost reduction. Is the data supply chain a monopoly in financial markets? If so, why has it been allowed to persist unchallenged?
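To illustrate what "machine readable" means in practice, here is a minimal sketch of parsing an XML-encoded corporate announcement. The element names and values are invented for illustration only; real feeds follow standards such as ISO 20022 or XBRL, which are far richer than this toy structure.

```python
import xml.etree.ElementTree as ET

# A hypothetical corporate-action announcement in XML.
# Element names are illustrative, not from any real standard.
announcement = """
<corporateAction>
    <issuer>Example PLC</issuer>
    <isin>GB0000000001</isin>
    <actionType>DIVIDEND</actionType>
    <amount currency="GBP">0.12</amount>
    <exDate>2024-05-01</exDate>
</corporateAction>
"""

root = ET.fromstring(announcement)
issuer = root.findtext("issuer")             # "Example PLC"
amount = float(root.findtext("amount"))      # 0.12
currency = root.find("amount").get("currency")  # "GBP"
print(issuer, amount, currency)
```

The point of the sketch is that once data is structured like this, every downstream system can extract the same fields automatically, with no re-keying and no ambiguity.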
The answer probably starts with the makeup of the supply chain itself. Issuers of data have time-honoured relationships with agents who distribute into financial markets, through market participants and beyond to investors and their agents. So, in the case of corporate data, the Issuer will have a long-standing and established distribution process. This may be assisted by their advisor, the Investment Bank, or by a PR company that utilises its newswire connections, and of course by the relevant Stock Exchanges.
All these players in the supply chain have data or data services they can sell. Thus, a single corporate announcement of some kind creates multiple copies of the same data. This is detrimental, as multiple forms of the same data tend to create the risk of “Chinese whispers” further down the chain. Data vendors will access multiple sources of the same data, clean it, then typically enhance it to incorporate all information and aspects. Each data vendor operates its own systems and has its own processes for qualifying the data, to ensure the fullest and most accurate presentation. They are trying to do a good job, but they charge highly for their services.
Historically, the Stock Exchange has held a pivotal position in the management and distribution of timely and accurate data. Exchanges do an excellent job and are successful, as can be seen from their annual financial results. So why do we need data vendors doing much the same thing? Could it be that multiple data distributors create choice, competition and the expectation of lower costs? If so, this objective does not appear to be working well. Users of data often purchase from more than one vendor, then incur added internal costs to establish the “Golden Copy”, with some banks buying the same data five or six times from different suppliers for completeness and accuracy. That cost is, of course, passed on to investors in a bundled charge. Should the regulators insist on unbundling this cost in the charges?
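The internal "Golden Copy" work described above amounts to reconciling several vendor copies of the same record. A toy sketch, assuming a simple field-level majority vote (the record fields and vendor values below are invented; real reconciliation systems also weight vendors, track provenance, and escalate unresolved conflicts to analysts):

```python
from collections import Counter

def golden_copy(records):
    """Reconcile vendor copies of one announcement by taking,
    for each field, the value most vendors agree on."""
    fields = set().union(*(r.keys() for r in records))
    golden = {}
    for field in fields:
        values = [r[field] for r in records if field in r]
        golden[field] = Counter(values).most_common(1)[0][0]
    return golden

# Three hypothetical vendor copies of the same dividend announcement;
# vendor_b has garbled the ex-date somewhere down the chain.
vendor_a = {"isin": "GB0000000001", "dividend": "0.12", "ex_date": "2024-05-01"}
vendor_b = {"isin": "GB0000000001", "dividend": "0.12", "ex_date": "2024-05-02"}
vendor_c = {"isin": "GB0000000001", "dividend": "0.12", "ex_date": "2024-05-01"}

print(golden_copy([vendor_a, vendor_b, vendor_c]))
```

Even this crude sketch shows why buying the same data several times over is expensive: every consuming firm repeats this cleaning work internally, and the cost is ultimately bundled into investor charges.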
Protectionism is clearly in play: large, powerful corporates throughout the supply chain have significant political lobbying capabilities to ensure that opportunities to introduce change are negated, and so inertia persists. None of this is a secret! Indeed, the whole industry is fully aware, yet appears powerless to force changes that are long overdue. Politicians and regulators could be a source and power for change, bringing the efficiencies and cost reductions that investors, and in fact all of society, should expect. Yet despite regular reviews, reports and investigations, nothing changes. What is required is the introduction of a game-changing business that revolutionises the market virtually overnight. An app could be the best and fastest way to break this ancient financial monopoly. Where will it come from? Watch this space!