Yambina (the name derives from an Aboriginal dialect) is a specialist company that provides a rules-based data management engine that centralises data from multiple sources both within a fund management firm (trading, accounting, settlement) and outside it (LEI providers, data vendors, fund administrators and prime brokers). The company also provides tools that enable a variety of users to access the data for internal trading and risk management purposes, as well as to produce regulatory reports. Dominic Hobson spoke to co-founder David Bilbé.

Hobson: Regulatory reporting is a vast data gathering, cleansing, validation and presentation issue. How are fund managers fulfilling that task today, and what are the problems they face?

Bilbé: There is more and more demand from regulators to provide information and reports in the form and substance they specify. Fund managers face issues of multiple data sources, including in-house spreadsheets, the cost of data acquisition from providers, data provenance challenges and the reconciliation of the myriad parameters set by regulators with their existing business processes. This is creating a heavy dependency on consultancy services, leading to slow reaction times and an imperfect fit with regulatory requirements, let alone user and business requirements. Data has to be gathered, cleansed, validated and formatted in such a way as to meet the demands of all interested parties - not just the regulators. This is an expensive challenge. Current tools are expensive and lack the requisite responsiveness to the changing needs of the business. You need multiple sources of data just to validate the quality of one stream of data.

Hobson: What are the advantages of the present approach?

Bilbé: The advantage of the current approach is that the industry has placed data management on a more strategic footing, and will increasingly do so. The demands of regulatory reporting in particular are leading to better risk management. Legal entity identifiers (LEIs), for example, are a direct result of the lack of accurate counterparty information in 2007-08. But there needs to be an acceleration of standardisation. It is more straightforward for those organisations which have invested in data management and have the budget and staffing to do so. They can better control inputs and outputs and reconcile them with the business process.

Hobson: What are the defects of the present approach?

Bilbé: It is not just about satisfying regulators - it is about getting a complete view across the asset classes. There is at present a lack of available tools to link business process and data effectively without engaging in expensive and cumbersome consultancy projects. Yet managers need a complete view across all asset classes and a linkage to provenance, feed licence and, ultimately, the ability to apportion the cost of compliance accurately to clients when invoicing them. Speed and accuracy, combined with provenance information linked to existing business processes, are the key to securing a permanent strategic advantage from what is at present a compliance-driven activity. The current approach could be argued to be too slow and too costly, but that reflects the lack of standardisation. It is a difficult balance to ensure the needs of business users and regulators are fulfilled without unnecessary duplication and therefore cost. The cost of rule change is currently much too high, and the level of dependency on IT departments to make those changes increases the lead-time.

Hobson: How can it be done better?

Bilbé: Moving towards the products and services offered by firms like ours will make a real difference. Our approach places more control in the hands of the users, rather than the data professionals alone, to define what they want, get access to clean data and put reports together for statutory or ad hoc purposes quickly and efficiently. This approach can transform the cost dynamics of the business. The alternative is to increase budget and invest more heavily in consultancy services or in-house data management, but ultimately that will not deliver any cost advantages. In addition, any failure to manage data accurately could have serious reputational risk implications.

Hobson: Can it be done cheaper?

Bilbé: Yes. Using open-architecture technology tools to align the needs of the business user with a rules-based engine will increase speed and cut cost. But cost-cutting is not just about a lower absolute cost. It also means reducing relative cost and avoiding the reputational cost of reporting incorrectly, or delivering inaccurate data to clients or regulators. This is an ongoing challenge for all managers. Centralised control, good exception management, adapting to new data requirements quickly, plus control over the golden copy of data, all reduce cost. We can help you do that.

Hobson: How do you guarantee consistency in your data?

Bilbé: Consistency is guaranteed by comparing at least two sources of data and providing a full audit and data clean-up, as defined by the user. This is a common but not the only method. Clients may have preferences for particular data sources depending on asset type, data provider or other factors, and this has to be built into the service offering. Dealing with exceptions efficiently and quickly reduces anomalies and risks.
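The two-source comparison and exception handling Bilbé describes can be sketched in outline. This is a hypothetical illustration, not Yambina's actual implementation: the field names, sources and tolerance are assumptions.

```python
# Hypothetical sketch of a two-source consistency check with exception capture.
# Instrument keys, prices and the tolerance value are illustrative assumptions.

def reconcile(source_a, source_b, tolerance=0.0001):
    """Compare two price feeds keyed by instrument.

    Returns a (golden, exceptions) pair: values confirmed by both
    sources form the golden copy; anything missing or mismatched is
    routed to an exception queue for clean-up, as defined by the user.
    """
    golden, exceptions = {}, []
    for key in source_a.keys() | source_b.keys():
        a, b = source_a.get(key), source_b.get(key)
        if a is None or b is None:
            exceptions.append((key, "missing in one source"))
        elif abs(a - b) > tolerance * max(abs(a), abs(b)):
            exceptions.append((key, f"mismatch: {a} vs {b}"))
        else:
            golden[key] = a  # both sources agree; admit to golden copy
    return golden, exceptions

# Illustrative feeds: a data vendor and a fund administrator
vendor = {"AAPL": 189.30, "MSFT": 411.10, "GOOG": 141.75}
admin = {"AAPL": 189.30, "MSFT": 415.00}

golden, exceptions = reconcile(vendor, admin)
```

In this sketch only the price both sources agree on enters the golden copy; the mismatched and single-sourced records become exceptions, which in practice would carry provenance and audit detail as the interview describes.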

Hobson: Is there ever a commercial upside to getting on top of your data?

Bilbé: Yes. Because data drives every decision, it needs to be accurate, available and complete. Failure in any component increases reputational risk and cost. Data which is cleansed and accurate can drive reporting requirements and ultimately be driven into asset management algorithms. By providing an aggregated view of data across asset classes you can derive useful information that mitigates your exposures to risk.

Hobson: What data failings encourage regulators to pay a site visit?

Bilbé: Lack of compliance with requirements, inaccurate data, failure to report, and failure to meet reporting deadlines. To avoid those risks, investment in data management is not, in itself, enough. Regulators are now demanding more information on a timely basis. That is because they want to avoid the political implications of a market meltdown they failed to foresee. In their view, that entails ensuring prudent risk management standards are in place at every individual company.

Hobson: Can (and should) the job be outsourced completely?

Bilbé: No. Users should take control of their own data with good processes and tools. Hire data professionals who know what the users and regulators want, and who can control the speed of delivery without excessive dependency on consultants. On the other hand, outsourcing the cleansing of data which does not have an impact on critical functions can be beneficial to cost management.

Hobson: What is revolutionary about the approach you propose?

Bilbé: Our approach is fundamentally different from that of competitors in the field. We aim to overcome the data fragmentation which occurs even in smaller companies by centralising data, putting even redundant data from older sources into one clear and clean cross-asset-class database. We integrate real-time data and data couched in the FIX and FpML message formats across asset classes, isolate suspect data and conduct a full audit of data quality. Our use of the openMDDB industry database reduces dependence on particular vendors and suppliers. Our rules engine can support all types of business processes and information requirements. In fact, our tools aim to place control of rule definition in the hands of the users and their data professionals.

In short, we offer speed, accuracy and an open architecture that gives users and data professionals a high degree of control over their own data - enabling them to see past talk about what they should do, and to look instead to the management information for evidence that risk is being appropriately identified and escalated.