Pricing model change needed post-Libor
With massive regulatory change ahead for the financial services sector with regard to pricing, many operators are still looking to adapt with an outdated, siloed approach. According to Delta Capita COO Philip Freeborn, organisations would be better advised to apply a holistic, platform-based way of working moving forward.
IBOR benchmarks, the most prominent of which is LIBOR (the London Interbank Offered Rate), have long been the globally accepted reference interest rates indicating borrowing costs between banks across major currencies and tenors. However, the benchmark has increasingly lost validity, thanks to a rigging scandal that rocked the City of London and because the underlying interbank lending market effectively shut down during the financial crisis.
As a result, a high-priority, market-level regulatory change is now on the cards. Due to take effect from January 2022 in most markets, the replacement of Libor will occur alongside the first phase of the long-running Fundamental Review of the Trading Book (FRTB). With just over a year to go, these two regulatory changes alone will drive a large volume of remediation into existing pricing and risk models – something which, according to Delta Capita Group CIO and COO Philip Freeborn, most financial organisations are finding difficult to act on.
In an article on the consultancy’s website, Freeborn explained, “The problem is that most of the traditional pricing and risk models are built on outdated and fragmented architecture. This means that they are very expensive and time consuming to amend. Entirely new technology is not just recommended to comply with FRTB and similar upcoming regulations but required for an efficient and cost effective Pricing and Risk infrastructure.”
According to the Delta Capita expert, pricing and risk technology in particular will be essential to financial institutions as they look to evolve. Traditional pricing and risk models are designed to run in the silos of individual desks or asset classes, and for specific use cases. Given the multifaceted nature of the coming changes, however, Freeborn anticipated that pricing models will need to move beyond this.
He explained, “IBOR and similar colossal changes to financial services like FRTB encourage the much-needed changes in pricing and risk technology. But it will also be essential to stay alert and ready for any other regulation changes. That is one of the reasons having a single pricing and risk platform capable of handling all asset classes and use cases is essential. New scripts can easily be integrated to adjust for any future changes to the pricing and risk process and algorithm. This kind of elegant architecture allows for future growth and gives you more control over the process—all without sacrificing more time, money, or manpower.”
New model
Illustrating how such a holistic platform could work, Freeborn suggested that instead of maintaining a “massive model library” of models, each with its own parameters and functions, financial organisations could move to an architecture with two primary components: a modelling environment – which includes templates for models, product payoffs, and risk factors, along with a standard way to calibrate – and a single numerical solver through which every model is executed. The compiled output can then be targeted and optimised for a variety of hardware-accelerated infrastructures, ensuring fast, efficient execution and the ability to scale to very large portfolios.
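To make the idea concrete, here is a minimal Python sketch of such a two-component setup, assuming a Monte Carlo engine acts as the single numerical solver; the names used (GBMModel, european_call_payoff, solve) are illustrative only and do not describe any specific vendor platform.

```python
import numpy as np

# --- Modelling environment: model templates, product payoffs, risk factors ---

class GBMModel:
    """Template for a lognormal (geometric Brownian motion) risk factor.
    In practice the parameters would come from a standard calibration step."""
    def __init__(self, spot, rate, vol):
        self.spot, self.rate, self.vol = spot, rate, vol

    def evolve(self, paths, dt, shocks):
        # One simulation step of the risk-factor dynamics.
        return paths * np.exp((self.rate - 0.5 * self.vol**2) * dt
                              + self.vol * np.sqrt(dt) * shocks)


def european_call_payoff(strike):
    """Product payoff template: plain-vanilla call on the terminal value."""
    return lambda terminal: np.maximum(terminal - strike, 0.0)


# --- Single numerical solver: one Monte Carlo engine for any model/payoff pair ---

def solve(model, payoff, maturity, n_paths=100_000, n_steps=250, seed=42):
    """Generic Monte Carlo engine: every model template is executed here,
    so desks share one solver instead of bespoke per-model code."""
    rng = np.random.default_rng(seed)
    dt = maturity / n_steps
    paths = np.full(n_paths, model.spot)
    for _ in range(n_steps):
        paths = model.evolve(paths, dt, rng.standard_normal(n_paths))
    discounted = np.exp(-model.rate * maturity) * payoff(paths)
    return discounted.mean()


if __name__ == "__main__":
    model = GBMModel(spot=100.0, rate=0.02, vol=0.25)   # calibrated elsewhere
    price = solve(model, european_call_payoff(strike=105.0), maturity=1.0)
    print(f"Monte Carlo price: {price:.4f}")
```

Adding a new product or model variant in this setup means registering another template, not rebuilding a numerical engine, which is the kind of reuse the single-solver argument rests on.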
In order to prepare for the impending end of IBOR, Freeborn suggested that such a platform could be built alongside existing solutions and run in parallel, so its results can be benchmarked against them. At the same time, part of the process can be automated by having the executable code generated by a compiler rather than written by hand, reducing error and increasing consistency – and freeing up analysts to focus their time on value-adding modelling and results analysis, rather than rebuilding the mathematical execution for every model.
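As a rough illustration of that parallel-run idea, the hypothetical helper below compares a legacy pricer with a candidate platform pricer across a small portfolio and flags any trade whose price diverges beyond a relative tolerance; the trade format and both pricer stand-ins are assumptions made for the example.

```python
def benchmark(portfolio, legacy_pricer, platform_pricer, tol=1e-2):
    """Run the new platform in parallel with the legacy library and flag
    any trades whose prices diverge beyond a relative tolerance."""
    breaches = []
    for trade in portfolio:
        legacy = legacy_pricer(trade)
        candidate = platform_pricer(trade)
        rel_diff = abs(candidate - legacy) / max(abs(legacy), 1e-12)
        if rel_diff > tol:
            breaches.append((trade["id"], legacy, candidate, rel_diff))
    return breaches


# Hypothetical usage: both pricers expose the same trade-dict interface.
portfolio = [{"id": "SWP-001", "notional": 1e6}, {"id": "SWP-002", "notional": 5e5}]
legacy_pricer = lambda t: 0.0123 * t["notional"]     # stand-in for the legacy library
platform_pricer = lambda t: 0.0124 * t["notional"]   # stand-in for the new platform
print(benchmark(portfolio, legacy_pricer, platform_pricer, tol=5e-3))
```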
He added, “You can start a process of migration away from the vast model libraries with millions of lines of code. And instead of taking many months to create a new model, it may take mere hours or days. You can rapidly develop models, create numerous challenger models to fast track model validation, and receive accurate analytics. Rather than view simulations that look ten days ahead, you can go as far as hundreds of timesteps into the future, depending on your hardware.”
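The challenger-model and long-horizon points can be sketched in a few lines. In the illustrative snippet below, a vectorised simulation steps hundreds of timesteps in one pass, and the same engine is rerun under perturbed volatilities, roughly how challenger models might be spun up against a base calibration; all parameter values are placeholders.

```python
import numpy as np

def simulate(spot, rate, vol, horizon, n_steps, n_paths, seed=0):
    """Vectorised path simulation: hundreds of timesteps in one pass,
    rather than looking only a few days ahead."""
    rng = np.random.default_rng(seed)
    dt = horizon / n_steps
    shocks = rng.standard_normal((n_steps, n_paths))
    log_paths = np.cumsum((rate - 0.5 * vol**2) * dt
                          + vol * np.sqrt(dt) * shocks, axis=0)
    return spot * np.exp(log_paths)   # shape: (n_steps, n_paths)


# Challenger models: perturb the calibrated volatility and rerun the same
# engine, so validation compares like-for-like analytics quickly.
base_vol = 0.25
for challenger_vol in (base_vol * 0.9, base_vol, base_vol * 1.1):
    paths = simulate(spot=100.0, rate=0.02, vol=challenger_vol,
                     horizon=1.0, n_steps=500, n_paths=10_000)
    print(f"vol={challenger_vol:.3f}  mean terminal value={paths[-1].mean():.2f}")
```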