
How TRIM influences the IT organization & regulatory compliance with these 4 tips

A little over two months ago, the ECB published its findings from the targeted review of internal models (TRIM). The message it carries is difficult to ignore: the banks that were part of the TRIM exercise largely underestimated their risks, and thus severely underestimated their capital requirements, resulting in an overall increase of 275 billion euro in aggregated RWAs for the financial institutions (FIs) in scope. At the root of this underestimation lie poor internal model data management and broader deficiencies in the model management itself.

In the wake of the TRIM results, Basel IV is appearing on the regulatory horizon, bringing with it what is possibly the biggest change for IRB banks since Basel II. Not only does Basel IV restrict the use of internal models for certain exposure classes, it also introduces for the first time a floor under the IRB models, based on a revised standardized approach. Looking at the scale of the problems TRIM brought to the surface, it is not surprising that regulators have taken a stronger interest in the internal models used at FIs, an interest that does not seem likely to fade in the years to come. On the positive side, the recommendations that flow from TRIM can serve as a head start for adopting the new Basel IV regulation.
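To make the floor concrete: under the fully phased-in Basel IV rules, the IRB outcome cannot fall below 72.5% of the RWA computed under the revised standardized approach. The minimal sketch below illustrates the mechanics with hypothetical figures; the function name and numbers are ours, not part of any regulation or vendor API.

```python
# Minimal illustration of the Basel IV output floor (fully phased-in at 72.5%).
# Figures are hypothetical; in practice the floor applies at aggregate RWA level.

FLOOR_RATIO = 0.725  # fully phased-in output floor

def floored_rwa(rwa_irb: float, rwa_standardized: float) -> float:
    """Effective RWA is the IRB outcome, floored at 72.5% of the
    revised standardized approach outcome."""
    return max(rwa_irb, FLOOR_RATIO * rwa_standardized)

# Hypothetical portfolio where the IRB models produce far lower RWA than the SA:
print(floored_rwa(rwa_irb=60.0, rwa_standardized=100.0))  # 72.5 -> the floor binds
print(floored_rwa(rwa_irb=80.0, rwa_standardized=100.0))  # 80.0 -> the IRB outcome stands
```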

Going one step further, what does all this mean for FIs and their IT organizations? How and where should they invest to ensure compliance with regulatory requirements on internal models? Here are the key features that regulatory applications (legacy or new) will need in order to be future-proof in this new era:

1. Consistent data management & governance:

Risk data is often collected from various source systems using archaic data integration tools that offer too few data management capabilities to keep up with regulatory and internal demands for transparency and governance. A fully UI-driven ETL that gives the user an end-to-end overview of the data flow from source to report, and that allows risk users themselves to work with the data, has become a basic necessity, yet it is perceived as lacking more often than not. Data governance has long been a slogan, but in an ever-changing world it has become a reality.
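What such lineage-aware data management can look like in practice is sketched below: a toy ETL step wrapper that records every transformation applied on the way from source to report. All names are illustrative assumptions of an in-house pipeline, not a specific tool's API.

```python
# A minimal sketch of lineage-aware ETL (illustrative names, no vendor API).
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Callable

@dataclass
class Dataset:
    name: str
    rows: list
    lineage: list = field(default_factory=list)  # ordered trail: source -> report

def step(description: str, transform: Callable[[list], list]):
    """Wrap a transformation so every applied step is recorded on the dataset."""
    def apply(ds: Dataset) -> Dataset:
        return Dataset(
            name=ds.name,
            rows=transform(ds.rows),
            lineage=ds.lineage + [(datetime.now(timezone.utc).isoformat(), description)],
        )
    return apply

# Example: exposures flowing from a source system toward a risk report.
raw = Dataset("exposures", rows=[{"id": 1, "ead": 100.0}, {"id": 2, "ead": None}])
cleaned = step("drop records with missing EAD",
               lambda rs: [r for r in rs if r["ead"] is not None])(raw)
for ts, desc in cleaned.lineage:
    print(ts, desc)  # an auditable trail the risk user can inspect
```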

2. Strong model management features:

Designing and calibrating models is an art, performed by gifted modelers in financial institutions. The data analytics tools used are often very powerful, but they are what they are: tools. All too often, huge effort and money are invested in constructing the models, with the model execution process being merely an afterthought. The model itself is conceived as an executable script that is never transformed into a usable (model) solution. This has resulted in a clear lack of model management capabilities, which becomes ever more pressing in turbulent times. As with data management, the risk user needs to be empowered to work with, change, version, and publish models, all governed by an overarching governance process. Those are the necessary ingredients to support immediate regulatory demands as well as the future real-life demands of risk departments.
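As an illustration of what "version and publish, governed by an overarching process" can mean technically, here is a minimal model registry sketch with a draft → approved → published lifecycle. The class and status names are hypothetical, not taken from any particular model management product.

```python
# A minimal sketch of a governed model registry (illustrative names only).
from dataclasses import dataclass

@dataclass
class ModelVersion:
    model_id: str
    version: int
    status: str = "draft"  # lifecycle: draft -> approved -> published

class ModelRegistry:
    _TRANSITIONS = {"draft": "approved", "approved": "published"}

    def __init__(self):
        self._versions: dict[tuple[str, int], ModelVersion] = {}

    def register(self, model_id: str) -> ModelVersion:
        """Create the next version of a model, starting life as a draft."""
        version = 1 + max((v for (m, v) in self._versions if m == model_id), default=0)
        mv = ModelVersion(model_id, version)
        self._versions[(model_id, version)] = mv
        return mv

    def promote(self, model_id: str, version: int) -> ModelVersion:
        """Advance one lifecycle step; published versions cannot be promoted."""
        mv = self._versions[(model_id, version)]
        if mv.status not in self._TRANSITIONS:
            raise ValueError(f"cannot promote a {mv.status} model")
        mv.status = self._TRANSITIONS[mv.status]
        return mv

registry = ModelRegistry()
v1 = registry.register("retail_pd")
registry.promote("retail_pd", 1)  # draft -> approved
registry.promote("retail_pd", 1)  # approved -> published
print(v1)  # ModelVersion(model_id='retail_pd', version=1, status='published')
```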

3. A flexible framework for model monitoring & model validation:

Model monitoring combines the key statistics of the model after each model execution and is often conducted by either the model development team or the risk users. Whoever monitors the models should have a full understanding of the model's performance in terms of overall distribution outcomes, the impact of data deficiencies (model variables), and the impact on the outcomes (risk factors). Model monitoring also requires the capability to drill through to the individual input parameters at counterparty or account level to check data completeness and accuracy.
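As a minimal illustration, the sketch below combines two of these monitoring ingredients: a population stability index (PSI) over rating-grade distributions to flag drift, and a drill-through that returns the individual accounts with missing model inputs. The field names and figures are illustrative assumptions.

```python
# A minimal monitoring sketch: PSI for distribution drift, plus an
# account-level drill-through for data completeness.
import math

def psi(expected: list[float], observed: list[float]) -> float:
    """PSI over matching buckets; > 0.25 is commonly read as material drift."""
    return sum((o - e) * math.log(o / e) for e, o in zip(expected, observed))

# Grade distribution at calibration time vs. the latest run (shares sum to 1).
print(round(psi([0.30, 0.40, 0.30], [0.25, 0.35, 0.40]), 4))

def incomplete_accounts(accounts: list[dict], required: list[str]) -> list[dict]:
    """Drill-through: return the individual accounts missing model inputs."""
    return [a for a in accounts if any(a.get(f) is None for f in required)]

accounts = [{"id": "A1", "ltv": 0.8, "income": None},
            {"id": "A2", "ltv": 0.6, "income": 50000}]
print(incomplete_accounts(accounts, required=["ltv", "income"]))  # flags A1
```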

Internal model validation is then the independent challenge of the underlying model methodologies and a review of the analysis of outcomes. Model validation is often treated as a periodic (often yearly) and mostly manual exercise by a validation team that operates independently from the model development team. It should, however, be an ongoing process, well embedded in the risk department's operations. On top of the data management and model monitoring features described above, internal model validation requires a sandbox approach to run, benchmark, and backtest models. TRIM identifies model validation as one area where progress is being made, but it remains a challenge for financial institutions.
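One classic backtesting ingredient for such a sandbox is a binomial test per rating grade, checking whether the observed default count is consistent with the calibrated PD. The sketch below is a simplified, hypothetical example; real validation frameworks cover far more (discrimination, stability, overrides, and so on).

```python
# A minimal backtesting sketch: one-sided binomial test of PD calibration
# for a single rating grade (simplified, hypothetical figures).
from math import comb

def binomial_pvalue(n: int, defaults: int, pd_estimate: float) -> float:
    """P(observing >= `defaults` defaults out of n obligors | true PD = estimate)."""
    return sum(
        comb(n, k) * pd_estimate**k * (1 - pd_estimate)**(n - k)
        for k in range(defaults, n + 1)
    )

# Hypothetical grade: 1,000 obligors, PD calibrated at 1%, 17 observed defaults.
p = binomial_pvalue(n=1000, defaults=17, pd_estimate=0.01)
print(f"p-value: {p:.4f}")  # a small p-value suggests the PD is underestimated
```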

4. Performance:

Flexibility will be key for applications: the ability to easily run and re-run different scenarios for the calibration of internal models. Not only will applications need the flexibility to do this in a user-friendly way, they will also need to run these scenarios quickly. Performance is the name of the game, as it provides the FI with a competitive advantage by allowing it to act and adapt more quickly.
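Since calibration scenarios are typically independent of one another, one straightforward way to gain speed is to execute them in parallel. The sketch below is a toy illustration of that idea; the scenario parameters and the stand-in model function are entirely hypothetical.

```python
# A minimal sketch of running calibration scenarios in parallel, assuming
# each scenario is an independent, CPU-bound model run (illustrative only).
from concurrent.futures import ProcessPoolExecutor

def run_scenario(params: dict) -> dict:
    """Stand-in for a full model execution under one scenario."""
    stressed_pd = params["base_pd"] * params["stress_multiplier"]
    return {"scenario": params["name"], "stressed_pd": stressed_pd}

scenarios = [
    {"name": "baseline", "base_pd": 0.01, "stress_multiplier": 1.0},
    {"name": "adverse",  "base_pd": 0.01, "stress_multiplier": 2.5},
    {"name": "severe",   "base_pd": 0.01, "stress_multiplier": 4.0},
]

if __name__ == "__main__":
    with ProcessPoolExecutor() as pool:  # scenarios run on separate cores
        for result in pool.map(run_scenario, scenarios):
            print(result)
```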


The outcome of TRIM is a real wake-up call for all FIs that use internal models. The sheer number of findings is certainly a concern, but the nature of the findings is even more worrying. For example, in the most commonly used type of internal model, PD models for retail and SME portfolios, TRIM issued findings for more than half of the models concerning low differentiation owing to the low discriminatory power of the scoring function. This in itself means that the modelers need to go back to the drawing board.
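Discriminatory power is typically measured with the AUC or the Gini coefficient (Gini = 2·AUC − 1), where a Gini near zero means the scoring function barely separates defaulters from survivors. Below is a minimal, dependency-free sketch with made-up scores, assuming the convention that a higher score means riskier.

```python
# A minimal sketch of measuring a scoring function's discriminatory power:
# AUC from pairwise comparisons, with Gini = 2*AUC - 1 (made-up scores).
def auc(scores_defaulters: list[float], scores_survivors: list[float]) -> float:
    """Probability a random defaulter scores higher (riskier) than a random survivor."""
    wins = sum(
        1.0 if d > s else 0.5 if d == s else 0.0
        for d in scores_defaulters
        for s in scores_survivors
    )
    return wins / (len(scores_defaulters) * len(scores_survivors))

defaulters = [0.9, 0.7, 0.65]
survivors = [0.6, 0.75, 0.3, 0.55]
a = auc(defaulters, survivors)
print(f"AUC: {a:.2f}, Gini: {2 * a - 1:.2f}")  # Gini near 0 => little differentiation
```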

For the low-default portfolios, Basel IV simply restricts or prohibits the use of models, after regulators realized that these models performed poorly in stress tests and that overall model outcomes lacked consistency. Stress testing, however, is becoming the new normal as a way for FIs to cope with the rapidly changing nature of the world we live in (cf. COVID). So it is safe to assume that the regulatory focus on internal models will only increase.

FIs are thus faced with significant investments to counter the findings from TRIM and the overall focus on internal models. In the years to come, investments will need to be made in:

  • Future-proofing the IT infrastructure with a focus on flexibility and performance, so it can support applications in scenario analysis and overall calculation performance. FIs' quants should focus on model construction and calibration, while the actual calculations should be performed in best-of-breed dedicated applications.
  • Putting in place a centralized model management framework to support the above, providing a transparent view on model versions, approval cycles,… and supporting the stressing of current models and their subsequent adaptation and deployment, with full traceability and transparency.
  • Keeping clean and consistent data at the center of it all, as failure to do so will inevitably lead to breaches of the standardized floor, restricting the usage of the internal models in which so much money and so many resources have been invested.

Both TRIM and Basel IV show us that the future will bring a shift not only in the way FIs calculate, but also in the tools they use and the processes around them. The new wave will be as much about IT change as about regulatory change.
