With the sheer volume of data available today, most hedge funds and private equity firms consume data from a variety of sources, such as market data providers, counterparties, fund administrators and accounting systems. As a result, data is processed and consumed at multiple touchpoints, which can lead to dataset duplication and quality issues. Implementing a robust master data management platform enables funds to mitigate these challenges while also delivering efficiencies and tangible insights.

Consider data consolidation and integration: a fund may want to view its P&L by GICS industry across various investments. The P&L data would come from the accounting system, while the industry classifications would come from a market data vendor. Because the two datasets are housed separately, assembling such a report is very difficult.
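As a rough illustration of the join described above, the sketch below combines P&L records from an accounting system with GICS classifications from a market data vendor, keyed on a shared security identifier. All field names, tickers and figures are hypothetical assumptions, not a real platform's schema.

```python
# Hypothetical sketch: joining P&L records (accounting system) with
# GICS industry classifications (market data vendor) on a shared
# security identifier. All names and values are illustrative.

# P&L per position, as it might arrive from the accounting system
pnl_records = [
    {"security_id": "AAPL", "pnl": 120_000.0},
    {"security_id": "XOM",  "pnl": -35_000.0},
    {"security_id": "MSFT", "pnl": 80_000.0},
]

# Classifications, as they might arrive from the market data vendor
industry_records = {
    "AAPL": "Information Technology",
    "MSFT": "Information Technology",
    "XOM":  "Energy",
}

def pnl_by_industry(pnl_rows, industries):
    """Aggregate P&L by GICS industry, flagging unclassified names."""
    totals = {}
    for row in pnl_rows:
        industry = industries.get(row["security_id"], "Unclassified")
        totals[industry] = totals.get(industry, 0.0) + row["pnl"]
    return totals

print(pnl_by_industry(pnl_records, industry_records))
```

Without a common platform, each team would rebuild this join by hand; a mastered dataset makes the combined view available once, for everyone.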
A master data management platform can solve many of the issues associated with data inconsistency through its ability to integrate data from any number of sources; however, this data integration must be supported by a robust change management process to be truly useful. For example, there may be one fund administrator today, two next year and four the year after that. As a firm grows, so does the number of data sources, making seamless, quick integration with new sources, along with pre-built adapters to common source systems, an absolute must.
Because a firm uses multiple data sources and providers, a master data management platform must also be capable of prioritizing and deduplicating data across those sources. Data stewards should be able to ensure that the integrated data is accurate, timely, consistent, accessible and checked for quality. To succeed, they also need the ability to govern the data across their various systems from a single place. In the example above, if an issue is found in the shared data, a user should be able to correct it in one place rather than having to go to two different sources.
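One common approach to the prioritization and deduplication described above is a "golden record" rule: when the same entity arrives from several systems, keep the value from the highest-priority source. The sketch below is a minimal, hypothetical version; the source names and priority ordering are assumptions for illustration.

```python
# Hypothetical sketch of source prioritization: lower number = higher
# priority. Source names and records are illustrative assumptions.

SOURCE_PRIORITY = {"fund_admin_a": 1, "fund_admin_b": 2, "market_vendor": 3}

records = [
    {"security_id": "AAPL", "price": 189.99, "source": "market_vendor"},
    {"security_id": "AAPL", "price": 190.05, "source": "fund_admin_a"},
    {"security_id": "XOM",  "price": 101.10, "source": "fund_admin_b"},
]

def deduplicate(rows, priority):
    """Keep one 'golden' record per security, chosen by source priority."""
    golden = {}
    for row in rows:
        key = row["security_id"]
        best = golden.get(key)
        if best is None or priority[row["source"]] < priority[best["source"]]:
            golden[key] = row
    return golden

master = deduplicate(records, SOURCE_PRIORITY)
print(master["AAPL"]["price"])  # the fund admin's price wins over the vendor's
```

In practice the priority rules themselves are governed by data stewards, often per attribute rather than per record, but the principle is the same.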
Throughout this process, the data stewards must ensure that the entire enterprise is discovering and consuming data from a centralized catalog. For example, when analyzing the firm’s holdings, the risk team should be looking at the same data as the portfolio managers. With a catalog built on top of a comprehensive data dictionary, different teams can be confident that they are speaking a common language when analyzing the firm’s data to generate insights.
It is important to remember that data integration and governance are not a one-way street. While examining data, different teams may have different questions about how it was created. Because of this, data transparency, supported by a robust data lineage module, is critical for teams to maintain confidence in the data’s accuracy.
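A minimal way to picture lineage is metadata that travels with each derived value, recording the step that produced it and pointing back to its parent. The sketch below is a hypothetical illustration of that idea; the step names and FX rate are assumptions, not any vendor's lineage model.

```python
# Hypothetical sketch of record-level lineage: each transformation wraps
# its output with a step name, timestamp and a pointer to the parent
# record's lineage, so a value can be traced back to its source.

from datetime import datetime, timezone

def with_lineage(value, step, parent=None):
    """Wrap a value with a lineage entry pointing back to its parent."""
    return {
        "value": value,
        "lineage": {
            "step": step,
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "parent": parent["lineage"] if parent else None,
        },
    }

raw = with_lineage(190.05, "ingest:fund_admin_a")
converted = with_lineage(raw["value"] * 0.92, "fx_convert:USD->EUR", parent=raw)

# Walk the chain back to the original source
entry = converted["lineage"]
while entry:
    print(entry["step"])
    entry = entry["parent"]
```

Production lineage modules capture far more (jobs, users, schemas), but the chain-of-parents structure is the core of what lets a team answer "where did this number come from?"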
Finally, after the data has gone through the above steps and is warehoused, a reporting layer on top of it adds significant value. This layer should come with out-of-the-box reports, but it should also allow users to seamlessly build new reports using a variety of tools.
In summary, a good master data management platform:
- Seamlessly integrates with multiple data sources and has a robust change management process
- Prioritizes and deduplicates the different sources
- Manages data governance from a single place
- Supports data discovery using an enterprise-level data catalog
- Ensures a common definition of the data
- Provides complete transparency into the data
- Allows the end user to seamlessly report on top of the mastered data