
Why banks should think twice before diving head first into the data pool

Charlie Browne, GoldenSource, Head of Market & Risk Data Solutions, London, 5 April 2019


The Fundamental Review of the Trading Book, or FRTB, is due to come into force between January 2022 and January 2027, depending on the national regulator in question. Exchange-facing private banks will have to compile mountains of extra data to meet the new requirements.

The FRTB contains new rules intended to make up for the shortcomings of Basel 2.5, which failed to solve some important problems related to market risk.

As the industry prepares to meet this next wave of wide-ranging regulation, much debate has revolved around the use of data pooling to ease the burden of the brand-new requirements on non-modellable risk factors.

For the first time, the FRTB will require banks to prove that risk factors actually trade by sourcing real, observable prices. This is a mammoth undertaking, and the question of where banks will get this data is one of the compelling arguments for data pooling. The idea is that banks, data vendors, exchanges and trade repositories can combine their data to capture a useful number of transactions.

It is a convincing proposition. Banks simply do not have enough of their own data. Add to this the fact that data is expensive and that many firms are keen to consolidate costs after several years of heavy regulatory demands, and the attraction of data pooling is clear.

At this stage it is difficult to predict the efficacy of doing this. Is a single vendor likely to become a one-stop shop, or will the ever-secretive banks be reluctant to rely on one source of information and instead enlist many?

Then there is the question of who will be responsible for working out whether a risk factor is modellable or not. Though we are still a couple of years away from reform, early indications suggest that many banks may not want to rely on the data pool for these calculations, preferring instead to use their own methods and processes.
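To make the calculation concrete, below is a minimal sketch of the kind of in-house modellability check a bank might run over pooled observation dates. The thresholds reflect the Basel Committee's published RFET criteria (at least 24 real price observations in the preceding 12 months, with no 90-day period containing fewer than four), but the function, its parameters and the data handling are illustrative assumptions rather than any vendor's or pool's actual API.

```python
from datetime import date, timedelta
from typing import Iterable


def passes_rfet(observation_dates: Iterable[date],
                as_of: date,
                min_obs: int = 24,
                window_days: int = 90,
                min_obs_per_window: int = 4) -> bool:
    """Illustrative RFET-style check: does a risk factor have enough
    real price observations over the preceding 12 months?

    A simplified sketch of the published criteria, not a compliant
    implementation.
    """
    one_year_ago = as_of - timedelta(days=365)
    obs = sorted(d for d in observation_dates if one_year_ago <= d <= as_of)

    # Criterion 1: enough observations in the 12-month look-back.
    if len(obs) < min_obs:
        return False

    # Criterion 2: no 90-day period with fewer than the minimum
    # number of observations. Slide a 90-day window day by day.
    start = one_year_ago
    while start + timedelta(days=window_days) <= as_of:
        end = start + timedelta(days=window_days)
        if sum(1 for d in obs if start <= d < end) < min_obs_per_window:
            return False
        start += timedelta(days=1)

    return True
```

In practice a bank could feed this kind of check with dates drawn from its own trades, a vendor feed, a pool, or all three, which is precisely why the question of who performs and stands behind the calculation matters.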

There are other drawbacks. Under the stringencies of the FRTB, a regulator may ask a bank, several years after the fact, to show where it obtained the pricing data used to determine that a particular risk factor was modellable. If this information came from a data pool, will the pool be able to supply the necessary evidence to the auditors? In other words, if a data pool says that a risk factor is modellable, will it have the capacity, and accept the responsibility, to fend off tough questions from the regulator further down the line?
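One way to address that audit question, regardless of whether the evidence comes from a pool or from the bank's own trades, is to record the provenance of every observation behind a modellability decision so it can be replayed years later. The sketch below assumes a simple in-house evidence store; the record fields and function names are hypothetical, not any particular pool's service.

```python
from dataclasses import dataclass, asdict
from datetime import date
import json


@dataclass
class PriceObservation:
    """Provenance record for a single real price observation."""
    risk_factor_id: str       # internal identifier of the risk factor
    observation_date: date    # date of the committed quote or trade
    price: float
    source: str               # e.g. own trade, vendor feed, data pool
    source_reference: str     # trade ID or pool record ID for traceability


def archive_decision(risk_factor_id: str,
                     modellable: bool,
                     evidence: list[PriceObservation],
                     path: str) -> None:
    """Append the decision and its supporting observations to an
    audit file (a stand-in for whatever evidence store is used)."""
    record = {
        "risk_factor_id": risk_factor_id,
        "modellable": modellable,
        "evidence": [
            {**asdict(o), "observation_date": o.observation_date.isoformat()}
            for o in evidence
        ],
    }
    with open(path, "a") as f:
        f.write(json.dumps(record) + "\n")
```

An append-only record of this sort is what would let either the bank or the pool answer the regulator's "where did this price come from?" question long after the decision was made.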

Although the logistical framework surrounding non-modellable risk factors and the Risk Factor Eligibility Test (RFET) are important hurdles to overcome, banks should be wary of ploughing too much time into solving this small part of a much wider-reaching set of rules.

The temptation to concentrate on tackling the RFET probably lies in the fact that it is the only part of the FRTB that is completely unprecedented. However, banks should avoid expending all their resources on it and instead consider how the whole of the FRTB might benefit their data strategies.

Because the guidelines are so wide-ranging, a bank that evolves the right strategy for its data in the context of the FRTB will automatically fulfil the data-related requirements of many other regulations (e.g. BCBS 239, Prudential Valuations and the Comprehensive Capital Analysis and Review). This is a massive opportunity for firms to evaluate their entire data-related infrastructures and ensure that they are taking a holistic approach to regulation rather than addressing different directives in silos. The last few years have seen a 'bolt-on' approach to regulation, with compliance teams trying to obey different regulations in different ways and at different times.

As with any new regulation, the temptation with FRTB is for banks to concentrate on the unknown. This is why people talk so much about data pools as a solution for non-modellable risk factors, but firms that spend too much time and money solving this one problem could be missing a trick. In many ways, the FRTB is an opportunity for compliance teams to take a step back, take stock and put together a comprehensive data strategy that protects them against many regulatory requirements.
