Collateral and Liquidity Data Management: The Next Big Challenge for Financial Institutions

The problem is well known: financial institutions have data all over the place. Small institutions tend to face straightforward challenges, while large ones must identify not only where data are hidden but how they can be aggregated without disrupting other processes. Thankfully, new advances in collateral and liquidity technology are ready to make solutions cost-effective and relatively painless to implement.

Imagine these scenarios that require data:

  • Regulators are mandating reporting that looks at all assets of a corporation, both on and off balance sheet, across every subsidiary and geography. How does a central reporting group collect the information?
  • Sales traders and their clients are cautious about balance sheet charges. How can a sales trader tell a client about the netting opportunities in a trade compared to existing holdings?
  • Large institutions have recently created central collateral funding desks. How can a trading division know what collateral is available internally to commit to a counterparty and how much it will cost?

These are all situations where data aggregation and management can play a pivotal role, saving substantial time and effort and opening doors to enhanced revenue opportunities.

The Four Vs of Collateral Management Data

The obstacles to effective collateral data management today begin with the sheer volume and dispersal of data around the world. This is in some ways a ‘Big Data’ problem, albeit with industry-specific twists.
We see four Vs at work in collateral and liquidity data management:

  • Volume – the volume of data that must be managed runs to gigabytes and terabytes for any financial institution of at least moderate size. The bigger the institution, the greater the volume of information that must be captured and analyzed.
  • Variety – collateral and liquidity data do not come standardized in a pre-packaged format. Instead, users must contend with multiple forms of data that must ultimately be combined to provide the right report or picture for taking action. This can happen with both internal and external data sources (see the harmonization sketch after this list).
  • Velocity – data move fast, and every new trade in financial markets means that something has changed in an institution’s holdings, whether the value of stocks owned, the need for a collateral call or the credit limit of a counterparty.
  • Veracity – it's great to have all data in one place, but how can users be sure that the data are accurate? Users need a way to verify the integrity of data across the enterprise.
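
To make the variety and veracity points concrete, here is a minimal sketch of what harmonization and verification can look like. It is illustrative only: the feed formats, field names and control totals are hypothetical, and real silo feeds are considerably messier.

    # A minimal sketch (not a production design): harmonize two hypothetical
    # silo feeds into one position schema, then run a simple veracity check.
    from dataclasses import dataclass

    @dataclass
    class Position:
        entity: str       # legal entity holding the asset
        isin: str         # security identifier
        quantity: float
        source: str       # originating silo, kept for auditability

    def from_repo_feed(row: dict) -> Position:
        # Hypothetical repo-desk format: {"le": ..., "sec_id": ..., "qty": ...}
        return Position(row["le"], row["sec_id"], float(row["qty"]), "repo")

    def from_otc_feed(row: dict) -> Position:
        # Hypothetical OTC margin format: {"entity": ..., "isin": ..., "notional": ...}
        return Position(row["entity"], row["isin"], float(row["notional"]), "otc")

    def verify(positions: list[Position], control_totals: dict[str, float]) -> list[str]:
        # Veracity: reconcile aggregated quantities against each silo's own control total.
        issues = []
        for source, expected in control_totals.items():
            actual = sum(p.quantity for p in positions if p.source == source)
            if abs(actual - expected) > 1e-6:
                issues.append(f"{source}: expected {expected}, aggregated {actual}")
        return issues

The point is not the code itself but the shape of the problem: every silo needs a translation into a shared vocabulary, and every aggregate needs a control it can be checked against.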

Existing Solutions

While institutions have largely solved these problems for a single business or legal entity in one legal jurisdiction, the problem remains far from solved once the boundaries extend beyond this limited scope. For example, getting a US OTC derivatives business to communicate with a UK secured funding business means bridging different IT systems and countries; each silo is difficult enough on its own, let alone ensuring that the technology solutions work together.

The financial markets industry has recognized the difficulty of collateral management and is supporting initiatives and utilities meant to solve the problem. DTCC-Euroclear GlobalCollateral Ltd is launching the Margin Transit Utility (MTU), which aims to aggregate a firm's holdings across all custodians and Central Securities Depositories. This is a great start, but even if all market participants and depositories agree to connect to the MTU, firms will still need to integrate this information internally and feed information back externally. That will require significant work across the board, and even in the best-case scenario it will take time.

Most technology providers also have excellent solutions for performing calculations and managing positions, but they rely on the client to deliver the data internally. This is the same data problem once again: even the best collateral management system is made less effective by incomplete, unreliable data inputs. Technology solutions therefore need to evolve to allow connecting and harmonizing data across multiple silos more easily, without requiring major multi-year re-engineering efforts.
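
One way to read "connecting and harmonizing data across multiple silos" is as a thin adapter layer: each silo keeps the system it already runs, and a small connector translates its output into a shared schema. The sketch below assumes this reading; the adapter and client names are invented for illustration.

    # Sketch of a connector layer over existing silo systems (names hypothetical).
    # Each adapter wraps a system that already exists; nothing is re-engineered.
    from abc import ABC, abstractmethod

    class CollateralSource(ABC):
        @abstractmethod
        def fetch_positions(self) -> list[dict]:
            """Return positions in the shared schema: entity, isin, quantity."""

    class RepoSystemAdapter(CollateralSource):
        def __init__(self, client):          # client = the existing repo system's API
            self.client = client

        def fetch_positions(self) -> list[dict]:
            return [{"entity": r["le"], "isin": r["sec_id"], "quantity": r["qty"]}
                    for r in self.client.open_trades()]

    class SecLendingAdapter(CollateralSource):
        def __init__(self, client):          # client = the existing sec lending API
            self.client = client

        def fetch_positions(self) -> list[dict]:
            return [{"entity": r["counterparty_entity"], "isin": r["security"],
                     "quantity": r["loan_qty"]} for r in self.client.loans()]

    def enterprise_view(sources: list[CollateralSource]) -> list[dict]:
        # Aggregation happens above the silos, without touching them.
        return [p for s in sources for p in s.fetch_positions()]

The design choice worth noticing is that adding a new silo means writing one more adapter, not re-platforming the desks that already work.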

Case Study: Recovery and Resolution Reporting

While the problems inherent in daily trading operations are readily understood, the importance of collateral and liquidity data management grows even larger when considering regulatory reporting requirements. As one example, the Federal Reserve's SR14-1 recovery and resolution plan reporting requirements for banks highlight the critical need for robust data management. According to the January 24, 2014 supervisory letter, the eight largest US banks should have:

  • Effective processes for managing, identifying, and valuing collateral they receive from and post to external parties and affiliates;
  • A comprehensive understanding of obligations and exposures associated with payment, clearing, and settlement activities;
  • The ability to analyze funding sources, uses, and risks of each material entity and critical operation, including how these entities and operations may be affected under stress;
  • Demonstrated management information systems capabilities for producing certain key data on a legal entity basis that is readily retrievable, with controls in place to ensure data integrity and reliability; and
  • Robust arrangements in place for the continued provision of shared or outsourced services needed to maintain critical operations that are documented and supported by legal and operational frameworks.

Four out of five of these bullet points speak directly to data management. There can really be no question: it is not only a good business practice for banks to have active collateral and liquidity data management programs, it is also a legal requirement under SR14-1.
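
For the management information systems bullet in particular, producing key data "on a legal entity basis that is readily retrievable" is at minimum an aggregation problem. The rough illustration below uses invented fields and figures, not anything prescribed by SR14-1.

    # Illustrative only: roll harmonized collateral records up by legal entity,
    # splitting received vs. posted, as a resolution-planning report might need.
    from collections import defaultdict

    def by_legal_entity(records: list[dict]) -> dict[str, dict[str, float]]:
        # Each record: {"entity": ..., "direction": "received"|"posted", "value": ...}
        totals = defaultdict(lambda: {"received": 0.0, "posted": 0.0})
        for r in records:
            totals[r["entity"]][r["direction"]] += r["value"]
        return dict(totals)

    records = [
        {"entity": "US Broker-Dealer", "direction": "received", "value": 1200.0},
        {"entity": "US Broker-Dealer", "direction": "posted",   "value": 800.0},
        {"entity": "UK Bank plc",      "direction": "posted",   "value": 450.0},
    ]
    print(by_legal_entity(records))
    # {'US Broker-Dealer': {'received': 1200.0, 'posted': 800.0},
    #  'UK Bank plc': {'received': 0.0, 'posted': 450.0}}

The hard part in practice is not the roll-up but everything upstream of it: the records only exist in this shape if the aggregation and harmonization work has already been done.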

Case Study: Central Collateral Trading Desks

As collateral visibility, management and optimization have grown in importance due to regulatory and economic pressures, many large financial institutions are setting up central collateral trading desks or functions. Trading collateral has always been a fundamental part of the dealer business, but it has usually been done in silos such as repo, securities lending, OTC derivatives and prime brokerage. The challenge of this new direction is that profitability has not grown at the same pace, which means these desks may not receive sufficient investment to build the requisite analytics and technology. In addition, the centralization of bank services across operations and technology means that the needs of specific collateral types may be ignored in a major technology renovation project.

A simple yet innovative solution to this problem is technology that serves as connectivity across all collateralized trading desks whether merged or in silos. Connectivity to repo, securities lending, OTC margining, futures, prime brokerage and other collateral-related business lines is critical to understanding both the big picture and the contributions of each business unit. By establishing this connectivity, firms can avoid major technology rebuilds or installs that may affect every trading desk in favor of middleware that provides data management as well as decision support across the organization.

By connecting all trading desks while leaving their product-specific technologies alone, firms can create a mechanism where data and analytics flow up to trading desks while decisions and actions flow down into the firm’s aggregate data pool. This creates a sizeable advantage for firms wanting to optimize their collateral trading activities while avoiding the cost and headache of a major technology project to harmonize platforms for data management.
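
The up-and-down flow described above can be pictured as a simple publish/subscribe loop: desks publish positions into the aggregate pool, and decisions travel back out on another channel. The toy sketch below, with invented topic names and a placeholder identifier, shows the mechanism only; a real implementation would sit on proper messaging infrastructure.

    # Toy pub/sub middleware: data flows up from desks, decisions flow back down.
    from collections import defaultdict
    from typing import Callable

    class Bus:
        def __init__(self):
            self.subscribers: dict[str, list[Callable]] = defaultdict(list)

        def subscribe(self, topic: str, handler: Callable) -> None:
            self.subscribers[topic].append(handler)

        def publish(self, topic: str, message: dict) -> None:
            for handler in self.subscribers[topic]:
                handler(message)

    bus = Bus()
    pool: list[dict] = []   # the firm's aggregate data pool

    # Data flows up: every desk publishes positions to the same topic.
    bus.subscribe("positions", pool.append)

    # Decisions flow down: desks listen for allocations from the central function.
    bus.subscribe("allocations", lambda m: print(f"{m['desk']}: pledge {m['isin']}"))

    bus.publish("positions", {"desk": "repo", "isin": "US0000000001", "qty": 5000000})
    bus.publish("allocations", {"desk": "repo", "isin": "US0000000001"})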

The Transcend Street Solution

We at Transcend Street Solutions have approached the data problem across multiple large financial institutions in a new way. Many technology vendors seek to be the golden source of all data. We do not. Instead, we want to connect to every golden source of data where it stands today. This asks a financial institution to provide access to its data, not to replace existing warehouses or infrastructure. Our first solution, CoSMOS, collates, harmonizes, mines and analyzes valuable information across enterprise-wide systems in real time. We then feed those data into platforms for business-user decision making, including regulatory reporting, internal applications and third-party collateral management systems. By acting as an overlay, our goal is to get the data out of storage quickly and into a useful, actionable format.

Once the process of collateral and liquidity data aggregation is complete throughout the global organization and across business units, there are a wide variety of applications that can be brought to bear. We see regulatory reporting, insight on collateral agreements, funding and position management, margin dashboard management and liquidity analytics as starting places. We expect that the collateral and liquidity space will evolve to require additional services.

Processing data for collateral and liquidity management is not an insurmountable task but it does take work. Many firms have only loose ideas about where every source of information is located internally across business units and geographies. But focusing on internal data aggregation enables a large number of other processes, reporting and technologies to function with maximum efficiency. The data problem is well-known: now solutions are appearing that confront the challenge in new ways.

This article was originally published on Securities Finance Monitor.