Banks that only debate FRTB calculation methods are missing the point
Tim Versteeg, APAC managing director, NeoXam. Paris, 18 February 2020
With the European Union's implementation of the Fundamental Review of the Trading Book (FRTB) looming, financial institutions are setting aside more and more resources for compliance projects related to the regulation.
The FRTB is a comprehensive suite of capital rules developed by the Basel Committee on Banking Supervision as part of its 'Basel III' set of reforms. Since private banks trade on the wholesale markets, their compliance officers ought to be at least aware of this reform, which was originally published under the title Minimum Capital Requirements for Market Risk. After some delays, implementation is now set for 2022. Much of the preparation involves banks balancing their capital between the standardised approach (SA) and the internal models approach (IMA).
Small banks, large banks
For smaller banks, the capital savings the IMA would deliver over the SA are too small to justify the cost of the new infrastructure it demands in terms of systems, data and processes. The SA calculations also have some synergies with risk calculations for other regulations, such as the upcoming Initial Margin (IM) requirements. It therefore makes sense for institutions that will be pulled into the upcoming IM tranche next September to include SA calculations in their preparations.
Banks with larger trading books that include instruments or assets with significant price variation, however, will benefit from switching to the internal models approach.
Although it is more costly, an approved IMA model means the capital adequacy reserves a firm needs to set aside will be lower, so the greater investment is worthwhile in the long run. Yet every bank caught up in these discussions can easily forget something important: its calculations are only as good as its data.
Data policies
If the bank does not have the correct data-handling processes in place, the shift from managing the classic one year's worth of historical data to handling a decade's worth could be extremely painful and costly from an operational perspective. This means it is time for firms to get their 'data houses' in order.
A good first step is to ensure that everything is bucketed correctly. Regardless of whether the bank chooses SA or IMA, it needs to make sure that the data backing its calculations is reliable and timely.
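What "bucketed correctly" and "reliable and timely" mean in practice will depend on each bank's own data model, but the minimal Python sketch below illustrates the kind of automated check involved. The bucket names, record fields and staleness threshold are hypothetical placeholders for illustration, not taken from the FRTB rule text or any vendor system.

    from datetime import date, timedelta

    # Hypothetical set of internal buckets; a real implementation would load
    # these from the bank's own risk-factor taxonomy rather than hard-code them.
    KNOWN_BUCKETS = {"IR_EUR", "IR_USD", "EQ_LARGE_CAP", "FX_G10", "CREDIT_IG"}
    MAX_STALENESS = timedelta(days=5)  # illustrative threshold for "timely" data

    def check_risk_factor(record: dict, as_of: date) -> list:
        """Return a list of data-quality issues for one risk-factor record."""
        issues = []
        if record.get("bucket") not in KNOWN_BUCKETS:
            issues.append(f"{record['id']}: unknown or missing bucket {record.get('bucket')!r}")
        last_obs = record.get("last_observation")
        if last_obs is None or as_of - last_obs > MAX_STALENESS:
            issues.append(f"{record['id']}: no observation within {MAX_STALENESS.days} days")
        return issues

    if __name__ == "__main__":
        sample = [
            {"id": "EUR_SWAP_10Y", "bucket": "IR_EUR", "last_observation": date(2020, 2, 14)},
            {"id": "XYZ_EQUITY", "bucket": "EQ_SMALL_CAP", "last_observation": date(2020, 1, 3)},
        ]
        for rec in sample:
            for issue in check_risk_factor(rec, as_of=date(2020, 2, 18)):
                print(issue)

Run daily across the whole risk-factor universe, a check of this kind turns "is our data reliable and timely?" from a periodic manual review into a standing exception report.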
Automation is also vital. It is very time-consuming to handle the data manually - the bank has to devise rules to identify a proxy, determine whether a price is valid and, if not, turn to alternative sources. This might be manageable for a year's worth of historical data, but it becomes nigh-on impossible if the bank has to cope with ten years.
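As a sketch of the kind of rules that end up being automated, the following example validates a price from a primary source and falls back first to alternative sources and then to a proxy. The source names, the 20% tolerance and the proxy logic are invented for illustration rather than anyone's production rules.

    from typing import Callable, Optional

    # Hypothetical tolerance: reject a quote that moves more than 20% day-on-day.
    MAX_DAY_ON_DAY_MOVE = 0.20

    def is_valid(price: Optional[float], previous_close: float) -> bool:
        """Basic validity rule: the price exists, is positive and is not an outlier."""
        if price is None or price <= 0:
            return False
        return abs(price / previous_close - 1.0) <= MAX_DAY_ON_DAY_MOVE

    def resolve_price(sources, previous_close: float, proxy: Callable[[], float]) -> float:
        """Try each source in priority order; fall back to a proxy if all fail."""
        for fetch in sources:
            price = fetch()
            if is_valid(price, previous_close):
                return price
        # No source produced a usable price, so apply the proxy rule
        # (for example, scaling a closely related instrument).
        return proxy()

    if __name__ == "__main__":
        # Stand-ins for primary and alternative data feeds.
        primary = lambda: None        # primary feed returned nothing today
        alternative = lambda: 101.3   # alternative vendor quote
        proxy_rule = lambda: 100.8    # e.g. last close adjusted by an index move

        print(resolve_price([primary, alternative], previous_close=100.0, proxy=proxy_rule))

Applied point by point across ten years of history, rules of this sort are exactly what a manual process cannot keep up with.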
It is not just a case of difficult calculations. Regulators also enforce the Basel Committee's standard number 239, entitled "Principles for effective risk data aggregation and risk reporting", and they are increasingly coming down hard on firms that leave their compliance open to human error. With the much larger data sets that the FRTB envisages, even semi-automated processes will have to become fully automated.
A freeing-up of resources
It is not all bad news. Banks that can put the right processes in place stand to gain more than the avoidance of compliance-related headaches. Automating data processes frees employees who would previously have been waiting for batch jobs to finish to work out how to get the most revenue out of the huge amounts of data now available to them.
Having a solid data foundation and an automated data process in place will not just help with the FRTB. Calculations for other regulations and any ad-hoc regulatory stress tests will be easier and quicker to produce. This is true regardless of whether a bank chooses SA or IMA.
* Tim Versteeg can be reached on +33 1 49 11 30 00.