Viewpoint : Data Management : Stephen Engdahl

DATA MANAGEMENT TO IMPROVE RISK MANAGEMENT.

Stephen Engdahl, Senior Vice President, Product Strategy, GoldenSource

Risk analysis is not new. Portfolio management and investment decision-making have been driven by risk analysis since at least the middle of the last century. Our industry is built around techniques for achieving maximum return with minimal (or optimal) risk. And the risk discipline requires copious amounts of timely, high-quality, relevant data. Risk models are only as good as the information fed into them.

Institutions now recognize that good data management is essential to improving their risk practices, and more than ever before they are prepared to do something about it. Risk analysis is a common driver of many active GoldenSource client projects focused on data management, data quality, and data governance. But if risk analysis is such an established practice, why are so many institutions only now focusing on data as it relates to risk? What changed?

Navigating an uncertain world

The markets have taught us many lessons over the last five years, including the importance of expecting the unexpected. Market and economic events relevant to risk management, such as counterparty defaults, market volatility, mortgage crises, and sovereign debt restructurings, have stressed the data management practices of most organizations.

Consider even the relatively simple task of measuring exposure along various dimensions – what was your exposure to a failed counterparty, including all its subsidiaries and related entities? To securities linked to subprime mortgages? To Greece? Now, imagine answering these questions in an environment where positions are recorded in multiple systems, the counterparty hierarchy is not well defined, and critical reference data attributes sit in spreadsheets or in separate, incompatible security masters, unlinked to counterparty and position data. What should be a simple task becomes a significant problem unless the right data infrastructure is in place.
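
To make the point concrete, here is a minimal sketch, in Python, of how exposure can be rolled up to a counterparty's ultimate parent once positions and the legal-entity hierarchy have been consolidated and linked. It is illustrative only, not a description of any particular system; the entity names, fields, and figures are hypothetical.

```python
# Hypothetical, consolidated position data: one record per counterparty.
positions = [
    {"counterparty": "AlphaBank Securities", "exposure": 12_000_000},
    {"counterparty": "AlphaBank London",     "exposure":  4_500_000},
    {"counterparty": "Unrelated Corp",       "exposure":  7_000_000},
]

# Counterparty hierarchy: child legal entity -> parent legal entity.
hierarchy = {
    "AlphaBank Securities": "AlphaBank Holdings",
    "AlphaBank London": "AlphaBank Holdings",
}

def ultimate_parent(entity: str) -> str:
    """Walk up the hierarchy until the top-level parent is reached."""
    while entity in hierarchy:
        entity = hierarchy[entity]
    return entity

def total_exposure(parent: str) -> float:
    """Sum exposure across every position whose counterparty rolls up to `parent`."""
    return sum(p["exposure"] for p in positions
               if ultimate_parent(p["counterparty"]) == parent)

print(total_exposure("AlphaBank Holdings"))  # 16500000
```

Without the linked hierarchy and consolidated positions, answering the same question means stitching data together by hand from multiple systems and spreadsheets – exactly the delay described above.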

Easy access to accurate risk data cannot be ignored, because we live in a volatile world. A “100-year flood” is supposed to occur roughly once a century, yet such events now arrive with alarming frequency. In the financial markets, the same goes for formerly “unthinkable” events such as major institutional failures and sovereign crises.

This situation is particularly stressful without a solid data foundation that aggregates and standardizes internal position and transaction data, and links it with clean counterparty, hierarchy, instrument and underlying data so that measurement and analysis can proceed. After facing a series of external events and realizing that answering critical questions took far too long, institutions are recognizing that the time to aggregate data, standardize it, and monitor it for quality is before the next question arises, not when it arises.

More regulation, please?

Multiple jurisdictions worldwide are asking for more data to support the decisions made and positions taken by financial institutions. Central clearing of derivatives changes the risk profile of those instruments. Certain trade reporting must be enriched with Legal Entity Identifiers (LEIs) and Unique Product Identifiers (UPIs). Solvency II requires much more granular data, across a broader set of attributes, than was needed in the past – touching all aspects of data, from position to instrument to counterparty.
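
As an illustration of that enrichment step, the sketch below attaches an LEI and a UPI to a raw trade record before it is reported. The reference-data tables, field names, and sample identifiers are hypothetical; the 20-character length check reflects the ISO 17442 LEI format.

```python
import re

LEI_PATTERN = re.compile(r"^[A-Z0-9]{18}[0-9]{2}$")  # 20-character ISO 17442 code

# Hypothetical reference data maintained by the firm's data management platform.
counterparty_lei = {"AlphaBank Holdings": "529900EXAMPLE00LEI00"}
product_upi = {"IRS-USD-5Y": "UPI-EXAMPLE-0001"}

def enrich_trade(trade: dict) -> dict:
    """Attach LEI and UPI to a raw trade record, failing loudly if data is missing."""
    lei = counterparty_lei.get(trade["counterparty"])
    if lei is None or not LEI_PATTERN.match(lei):
        raise ValueError(f"No valid LEI for counterparty {trade['counterparty']!r}")
    upi = product_upi.get(trade["product"])
    if upi is None:
        raise ValueError(f"No UPI for product {trade['product']!r}")
    return {**trade, "lei": lei, "upi": upi}

trade = {"trade_id": "T-001", "counterparty": "AlphaBank Holdings", "product": "IRS-USD-5Y"}
print(enrich_trade(trade))
```

The point is less the code than the dependency it exposes: regulatory reporting is only as reliable as the counterparty and product reference data behind it.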

The strategic reason for a firm to invest in data management capabilities should always be to improve data quality, and in turn its own decision-making and investment performance. However, regulation is providing an additional push to get things right today.

Making order out of chaos

Prompted both by regulatory requirements and by market events, risk managers, portfolio managers, and other consumers of information within the firm are now asking more questions about the lineage of the data they use. In addition to demanding better data quality, the business is demanding a view into the data supply chain. This is particularly important in areas such as valuations, where the business needs to show why a particular valuation was applied to a thinly traded instrument, and what other sources and models for valuation were available.

Transparency requires data management operations to run on systems with full audit trails, entitlements control, and easy access points for end users, rather than spreadsheets. It also requires the data lifecycle within an organization (and the governance processes which surround and support it) to be clearly defined and documented. Every set of data attributes has a data steward somewhere within the organization who knows the most about how that data behaves, and the business needs a simple way of finding that person when it has questions about the attributes.
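
The kind of lineage and audit metadata described above can be captured in a simple record structure rather than in a spreadsheet. The sketch below is illustrative only; the fields, identifiers, sources, and steward names are hypothetical.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Optional

@dataclass
class ValuationRecord:
    instrument_id: str
    price: float
    source: str                  # vendor feed or internal source that supplied the price
    model: Optional[str]         # pricing model used for a thinly traded instrument
    data_steward: str            # who to ask when questions arise
    audit_trail: list = field(default_factory=list)

    def revise(self, new_price: float, reason: str, user: str) -> None:
        """Record every change with who made it, when, and why."""
        self.audit_trail.append({
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "old_price": self.price,
            "new_price": new_price,
            "reason": reason,
            "user": user,
        })
        self.price = new_price

val = ValuationRecord("XS0000000000", 98.75, source="Vendor A evaluated price",
                      model="discounted cash flow", data_steward="fixed-income data team")
val.revise(98.60, reason="override: stale vendor quote", user="pricing_desk")
```

Every revision is traceable to a person, a time, and a reason, which is precisely the view into the data supply chain that the business is asking for.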

The path forward

The risk discipline is facing new scrutiny as firms steel themselves for the present and ready themselves to compete in the future. Integral to good risk management is sound data management policy, practices, and systems. Given the current state of data management in many firms, reaching an end state of clean, consolidated, complete data that drives effective risk analysis may seem a daunting task.

Luckily, that does not have to be the case. Leading institutions that have made strides in this area follow three principles:

1. Define success criteria around sets of data consumers

Data management initiatives achieve their best and quickest results when they focus on delivering value to a particular region, a department, an application, or a business function. This ensures that each project phase provides ROI, and sets the stage to iteratively increase ROI in subsequent phases. It’s important to have a solid end-state vision, but equally important to achieve it in incremental steps.

2. Don’t forget the data generated within the firm

Data governance and quality processes apply to all data assets that need to be aggregated and leveraged across the enterprise. Many data projects consider only security master data, which is procured from external vendors. However, that is only part of the picture. A solid end-state vision must address counterparties, accounts, and associated hierarchies. It must also include positions and transactions. The quality and standardization of such information is critically important to risk, to regulation, and to gaining a 360-degree view of the business.

3. Be selfish – identify strategic benefits from regulatory projects

Regulation is often viewed as something to do “because we have to.” But the data aggregation, standardization, quality management, and transparency required by many regulatory initiatives can also be used for competitive gain. Leading institutions derive strategic gains from their regulatory initiatives, such as getting to know their customers better, operating more efficiently, creating new financial products more quickly, and improving their investment performance.

Data management has reached mass adoption. Early adopters have paved the way and provided the rest of us with lessons learned about how to get data management right. Now, given market uncertainty, regulatory requirements, and the drive for business transparency, it is a perfect time to create your own data management roadmap.

©BestExecution 2013
