In the maze of Financial Services, Model Governance is the missing link


Predictive models are the lifeline of a Financial Services organization. On the business side, the ability to accurately predict default is a compelling competitive advantage. Moreover, banks operate on a delicate balance and interplay of assets and liabilities and need to be on top of the portfolio at all times. Even a slight deterioration beyond the expected level can be catastrophic. As events have shown, a herd mentality takes over at the first hint of trouble and develops into a firestorm that engulfs the entire organization.


Things are equally critical on the regulatory side. In the wake of the Basel rules and, more recently, the Dodd-Frank legislation, banks have to be supremely vigilant about their books and have to ensure that their business practices are completely above board. A critical aspect of compliance is the quality of the models being deployed, the assumptions behind them and the governance process around them.

Let us look at a typical modelling project life-cycle. When the project is proposed, the modelling team understands the requirement and why a model has to be built to solve the existing business problem. The project then moves to the analyst who will work on it. If the analyst is lucky, the data required for the model is available from a single source and in the appropriate format. If not, he has to work with different teams to obtain the correct data and perform data cleaning just to get the dataset ready.
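
As a rough illustration of this preparation step, the sketch below joins two hypothetical source extracts and applies basic cleaning. The file names, keys and cleaning rules are assumptions made for the example, not a prescribed pipeline.

```python
# A minimal, illustrative sketch of the data-preparation step described above.
# File names, column names and cleaning rules are assumed for the example.
import pandas as pd

# Pull raw extracts from two hypothetical source systems
accounts = pd.read_csv("core_banking_accounts.csv")   # balances, limits, tenure
bureau = pd.read_csv("credit_bureau_extract.csv")     # external scores, delinquencies

# Join on a common customer key and keep one row per customer
df = accounts.merge(bureau, on="customer_id", how="left")
df = df.drop_duplicates(subset="customer_id")

# Basic treatment: fix obviously bad values and impute missing ones
df["utilization"] = (df["balance"] / df["credit_limit"]).clip(lower=0, upper=1)
df["bureau_score"] = df["bureau_score"].fillna(df["bureau_score"].median())

# The cleaned dataset is now ready for exploration and model development
df.to_csv("model_dev_dataset.csv", index=False)
```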

Then comes the usual model building process – understand the variable distributions, perform data treatment, analyse the key drivers and start the model development. After N iterations, the analyst is in a position to say, “Yes, this is the best model, given the data availability and the predictive power of the variables.” He then shares the model results and documents them.
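
The sketch below shows what one pass of this development loop might look like for a default-prediction model, assuming a binary default flag and a handful of candidate drivers. The variable names and the choice of logistic regression are illustrative assumptions, not the only way to build such a model.

```python
# A minimal sketch of one iteration of the model-development loop.
# Driver names, the target flag and the algorithm choice are assumptions.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

df = pd.read_csv("model_dev_dataset.csv")
candidate_drivers = ["utilization", "bureau_score", "tenure_months", "dpd_last_12m"]

# Understand variable distributions before any treatment
print(df[candidate_drivers].describe())

# Simple treatment: cap extreme values at the 1st/99th percentiles
for col in candidate_drivers:
    lo, hi = df[col].quantile([0.01, 0.99])
    df[col] = df[col].clip(lo, hi)

X_train, X_test, y_train, y_test = train_test_split(
    df[candidate_drivers], df["default_flag"], test_size=0.3, random_state=42
)

# Fit a candidate model and record its hold-out discriminating power;
# in practice this step is repeated over many driver sets and specifications
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"Hold-out AUC: {auc:.3f}")
```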

But does it actually stop there? No. The model has to be validated – not the in-sample/out-of-sample validation, which the analyst will have completed, but validation by the actual users of the model. They have to understand the risk associated with implementing it. And there is no guarantee that the supposedly “best” model will continue to work in the coming months and years. If the business blindly implements the model and its risk is not assessed periodically, the results can be disastrous.
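
A minimal sketch of such periodic, out-of-time assessment is shown below: each new monthly vintage is scored with the frozen model and its discriminating power is compared against the benchmark recorded at approval. The file names, the development AUC and the 5% tolerance are assumptions for illustration.

```python
# Illustrative out-of-time monitoring: flag deterioration against a benchmark.
# Model artefact, file names, benchmark AUC and tolerance are assumed values.
import pandas as pd
from sklearn.metrics import roc_auc_score
import joblib

model = joblib.load("approved_default_model.pkl")
drivers = ["utilization", "bureau_score", "tenure_months", "dpd_last_12m"]
development_auc = 0.78          # benchmark recorded at model approval (assumed)

for month in ["2015-01", "2015-02", "2015-03"]:
    vintage = pd.read_csv(f"scored_population_{month}.csv")
    auc = roc_auc_score(vintage["default_flag"],
                        model.predict_proba(vintage[drivers])[:, 1])
    if auc < development_auc * 0.95:    # more than a 5% relative drop
        print(f"{month}: AUC {auc:.3f} breached tolerance - refer to model governance")
    else:
        print(f"{month}: AUC {auc:.3f} within tolerance")
```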


The following examples come to mind:

  • In 1997, after a heavy loss of USD 83 million, the Bank of Tokyo-Mitsubishi realized that it had used the wrong model to trade swaptions. The Black-Derman-Toy model, originally designed to calibrate to market prices of at-the-money swaptions, was found to be too simple to price out-of-the-money and Bermudan swaptions, which require multi-factor models.
  • In 1982 the Vancouver Stock Exchange established a new index initialized at a level of 1000. Twenty-two months later the index had steadily declined to about 520, even though the exchange was setting records in value and volume. A team of investigators found that the index, which was updated after every transaction, simply dropped the digits after the third decimal place instead of rounding (see the sketch after this list). The “true” (rounded) value would have been 1098.892.
  • The financial institution UBS took advantage of a provision in British tax law to obtain a lower dividend tax credit. When the British government changed the tax law in 1997, UBS suffered huge losses. UBS was not the only bank taken by surprise by this change, but it was the one that suffered the most. The possibility of a change in this law had simply been neglected in the models.
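
To make the Vancouver example concrete, here is a small simulation of how truncating to three decimal places instead of rounding steadily bleeds value out of an index. The number of recomputations and the size of each per-transaction move are assumptions chosen only to make the drift visible.

```python
# Illustrative only: truncation vs. rounding drift in a repeatedly updated index.
# The transaction count and move sizes are assumed, not historical data.
import random

random.seed(0)
truncated = rounded = 1000.0
n_updates = 500_000                          # assumed number of recomputations

for _ in range(n_updates):
    change = random.uniform(-0.005, 0.005)   # tiny per-transaction move
    truncated = int((truncated + change) * 1000) / 1000   # drop digits after 3rd decimal
    rounded = round(rounded + change, 3)                   # round correctly

print(f"Truncated index: {truncated:.3f}")   # drifts far below the true level
print(f"Rounded index:   {rounded:.3f}")     # stays near the starting level
```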

The importance of Model Governance, then, cannot be overstated, and it requires a rigor on par with the building of the model itself. In practice, however, this is easier said than done: model governance is often viewed as a necessary evil tied to regulatory compliance rather than as a value-added activity.

When its value and its role in effective risk management are understood, Model Governance provides:

  1. A proactive ability to assess the effectiveness of models – their accuracy & discriminating capability (see the sketch after this list).
  2. An objective, data-driven, documented & independent process of assessment.
  3. A structured process to review, refine & rebuild models in time, to limit adverse impact.
  4. An iterative & ongoing approach – refining tools & processes to address evolving needs.
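
One common, objective check of this kind is the Population Stability Index (PSI), which compares the current score distribution against the development sample. The sketch below is a minimal illustration; the simulated score distributions are assumed, and the 0.1/0.25 thresholds are widely used rules of thumb rather than a mandated standard.

```python
# A minimal PSI sketch: how far has the scored population drifted from development?
# The simulated distributions and the 0.1/0.25 thresholds are assumptions.
import numpy as np

def psi(expected_scores, actual_scores, n_bins=10):
    """PSI between development (expected) and current (actual) score distributions."""
    cuts = np.quantile(expected_scores, np.linspace(0, 1, n_bins + 1))
    cuts[0], cuts[-1] = -np.inf, np.inf
    exp_pct = np.histogram(expected_scores, bins=cuts)[0] / len(expected_scores)
    act_pct = np.histogram(actual_scores, bins=cuts)[0] / len(actual_scores)
    exp_pct = np.clip(exp_pct, 1e-6, None)
    act_pct = np.clip(act_pct, 1e-6, None)
    return float(np.sum((act_pct - exp_pct) * np.log(act_pct / exp_pct)))

# Example with simulated scores: the current population has shifted since development
dev = np.random.beta(2, 5, 10_000)        # development-time scores (assumed)
cur = np.random.beta(2.6, 4.2, 10_000)    # current scores with an assumed shift
value = psi(dev, cur)
status = "stable" if value < 0.1 else "monitor" if value < 0.25 else "review/rebuild"
print(f"PSI = {value:.3f} -> {status}")
```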

This puts a huge burden on the analyst, since he has to continuously advise the implementation team on whether the model’s predictive power is still top notch. If it is not, the burden shifts to the implementation team – they are under continuous pressure and have to bear the model risk. This is where governance comes into the picture: there should be a defined structure which ensures that model risk is assessed, to instill greater confidence in the reliability of model output.

Every organization should enforce a vigilant governance process to ensure that only models with minimal risk are implemented. In fact, this should be deep-rooted in management’s strategic thinking and in the institution’s culture in order to enhance the firm’s overall risk management effectiveness.

At BRIDGEi2i, we have an automated, single model-tracking platform with a consistent approach to proactively identifying and reacting to potential issues. This helps organizations by:

  • Enabling a well-defined model governance process that addresses regulatory needs.
  • Providing a single repository for all models that is auditable & transparent.
  • Giving the leadership team visibility through an objective, consistent assessment of model risk.
  • Proactively identifying performance issues & drilling down to key drivers.
  • Automating model deployment & tracking (at any frequency) to reduce time & effort.

This blog post was written by Siva Kumar, Analytics Consultant at BRIDGEi2i.

About BRIDGEi2i: BRIDGEi2i provides Business Analytics Solutions to enterprises globally, enabling them to achieve accelerated business impact by harnessing the power of data. Our analytics services and technology solutions enable business managers to consume more meaningful information from big data, generate actionable insights from complex business problems and make data-driven decisions across pan-enterprise processes to create sustainable business impact. To know more, visit www.bridgei2i.com


 

The views and opinions expressed in this article are those of the author and do not necessarily reflect the official position or viewpoint of BRIDGEi2i.
