November 16, 2016

How FRTB Compliance Will Impact Bank IT…

(Image Credit – Activeviam)

As discussed in the previous blog in this series (https://hortonworks.com/blog/frtb-fundamental-review-trading-book-changes-banking-risk-management/), the FRTB (Fundamental Review of the Trading Book) compels Banks to create unified teams from various departments – especially Risk, Finance, the Front Office (where trading desks sit) and Technology – to address the significant challenges of the regulation.

From a technology capabilities standpoint, the FRTB presents Banks with a combined data volume, velocity and analysis challenge. Let us now examine the technology ramifications.

Technology Ramifications around the FRTB… 

The FRTB rules herald a clear shift in how IT architectures work across the Risk area and the Back office in general.

  1. The FRTB calls for a single source of data that pulls from silos across the front office, trade data repositories, a range of BORT (Book of Record Transaction) systems etc. With the FRTB, source data needs to be centralized and available in one location where every feeding application can trust its quality.
  2. With both the IMA and the SBA in the FRTB, many more detailed & granular data inputs (across desks & departments) need to be fed into the ES (Expected Shortfall) calculations from varying asset classes (Equity, Fixed Income, Forex, Commodities etc.) across multiple scenarios (a minimal ES sketch follows this list). The calculator frameworks developed or enhanced for the FRTB will need ready & easy access to real-time data feeds in addition to historical data. At the firm level, the data requirements and the calculation complexity will be even higher, as the calculations need to include the entire position book.
  3. The various time horizons called out also increase the need to run a full spectrum of analytics across many buckets. The analytics themselves will be more complex than before, with multiple teams working on all of these areas. This calls for standardization of the calculations themselves across the firm.
  4. Banks will also have to provide complete audit trails, both for the data and for the processes that worked on the data to produce these risk exposures. Data lineage, audit and tagging will be critical.
  5. The number of runs required for regulatory risk exposure calculations will go up dramatically under the new regime. The FRTB requires that each risk class be calculated separately from the whole set. Coupled with the increased calculation windows discussed in #3 above, this means more compute processing power and vectorization will be needed.
  6. The FRTB also implies that, from an analytics standpoint, a large number of scenarios must be run on a large volume of data. Most Banks will need to standardize their libraries across the house. If Banks do not look to move to a Big Data Architecture, they will incur tens of millions of dollars in hardware spend.
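To make the ES calculation in #2 concrete, here is a minimal sketch of an Expected Shortfall computation over simulated scenario P&L vectors, run separately per risk class as the rules require. The risk classes, scenario counts and normally distributed P&L are illustrative assumptions, not an IMA implementation.

```python
import numpy as np

def expected_shortfall(pnl: np.ndarray, confidence: float = 0.975) -> float:
    """ES = average loss over the scenarios at least as bad as the VaR cut-off."""
    var_cutoff = np.quantile(pnl, 1.0 - confidence)  # e.g. 2.5th percentile of P&L
    tail = pnl[pnl <= var_cutoff]                    # the worst-case scenarios
    return -tail.mean()                              # report ES as a positive loss

# Hypothetical scenario P&L per risk class (10,000 scenarios each)
rng = np.random.default_rng(42)
scenario_pnl = {
    "equity":       rng.normal(0, 1.2e6, 10_000),
    "fixed_income": rng.normal(0, 0.8e6, 10_000),
    "forex":        rng.normal(0, 0.5e6, 10_000),
}

# The FRTB requires each risk class to be calculated separately from the whole set
for risk_class, pnl in scenario_pnl.items():
    print(f"{risk_class:>12}: ES(97.5%) = {expected_shortfall(pnl):,.0f}")
```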

The FRTB is the most pressing in a long list of Data Challenges facing Banks… 

The FRTB is yet another regulatory mandate that lays bare the data challenges facing every Bank. Current Regulatory Risk Architectures are based on traditional relational database (RDBMS) architectures with tens of feeds from Core Banking Systems, Loan Data, Book of Record Transaction Systems (BORTS) holding Trade & Position Data (e.g. Equities, Fixed Income, Forex, Commodities, Options etc.), Wire Data, Payment Data, Transaction Data etc.

These data feeds are then tactically placed in in-memory caches or in enterprise data warehouses (EDW). Once the data has been extracted, it is transformed using a series of batch jobs that prepare it for the Calculator Frameworks, which then run the risk models on it.
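A hedged sketch of that extract-transform-hand-off step is below, written in PySpark since a Hadoop cluster is the natural home for these feeds. All paths, table schemas and column names are hypothetical.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("frtb-feed-prep").getOrCreate()

# Extract: raw feeds landed from BORT systems, payments, wires etc. (hypothetical paths)
trades = spark.read.parquet("/data/raw/trades")
positions = spark.read.parquet("/data/raw/positions")

# Transform: normalize and aggregate to the granularity the calculators expect
desk_positions = (
    trades.join(positions, "instrument_id")           # hypothetical join key
          .withColumn("notional_usd", F.col("notional") * F.col("fx_rate"))
          .groupBy("desk_id", "risk_class")
          .agg(F.sum("notional_usd").alias("net_notional"))
)

# Load: a curated, trusted zone that every calculator framework reads from
desk_positions.write.mode("overwrite").parquet("/data/curated/desk_positions")
```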

All of the above applications need access to medium to large amounts of data at the individual transaction level. The Corporate Finance function within the Bank then makes end-of-day adjustments to reconcile all of this data, and these adjustments need to be cascaded back to the source systems, down to the individual transaction or class-of-transaction level.
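As an illustration of that cascade, the sketch below spreads a book-level end-of-day adjustment back across individual transactions pro-rata by notional. The data model and the pro-rata allocation rule are assumptions for illustration only.

```python
transactions = [
    {"txn_id": "T1", "notional": 600_000.0},
    {"txn_id": "T2", "notional": 300_000.0},
    {"txn_id": "T3", "notional": 100_000.0},
]

def cascade_adjustment(txns, adjustment):
    """Allocate a book-level adjustment across transactions in proportion to size."""
    total = sum(t["notional"] for t in txns)
    for t in txns:
        t["adjustment"] = adjustment * t["notional"] / total
    return txns

# A hypothetical -25,000 reconciliation adjustment, pushed down to transactions
for t in cascade_adjustment(transactions, adjustment=-25_000.0):
    print(t["txn_id"], round(t["adjustment"], 2))
```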

These risk applications are then typically deployed on clusters of bare-metal servers that are not particularly suited to portability, automated provisioning, patching & management. In short, nothing that can be moved over automatically at a moment's notice. These applications also run on legacy proprietary technology platforms that do not lend themselves to a flexible, DevOps style of development.

Finally, there is always a need for statistical frameworks to make adjustments to customer transactions that somehow need to be reflected back in the source systems. All of these frameworks need access to, and an ability to work with, terabytes (TBs) of data.

Each of the above-mentioned risk work streams has corresponding data sets, schemas & event flows that it needs to work with, and each has different temporal needs for reporting: some need to be run a few times a day (e.g. Traded Credit Risk), some daily (e.g. Market Risk) and some at the end of the week (e.g. Enterprise Credit Risk).
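A small sketch of those differing run cadences follows. The work stream names come from the text; the cadence encoding and run windows are assumptions.

```python
# Map each risk work stream to its reporting cadence (from the text above)
RUN_SCHEDULE = {
    "traded_credit_risk":     "intraday",  # a few times a day
    "market_risk":            "daily",
    "enterprise_credit_risk": "weekly",
}

def runs_due(cadence: str, hour: int, weekday: int) -> bool:
    """Very simplified dispatch: decide whether a work stream should run now."""
    if cadence == "intraday":
        return hour in (9, 13, 17)          # illustrative intraday run windows
    if cadence == "daily":
        return hour == 18                   # end of day
    return weekday == 4 and hour == 18      # Friday end of day for weekly runs

for stream, cadence in RUN_SCHEDULE.items():
    print(stream, "due" if runs_due(cadence, hour=18, weekday=4) else "not due")
```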

One of the chief areas of concern is that the FRTB may require a complete rewrite of analytics libraries. Under the FRTB, front office libraries will need to support Enterprise Risk – a large number of analytics run on a vast amount of data. Front office models cannot make all the assumptions that enterprise risk can to price a portfolio accurately. Front office systems run a limited number of scenarios, trading off timeliness for accuracy – as opposed to enterprise risk.

Most banks have stringent vetting processes in place, and all of the rewritten analytic assets will need to pass through them. Every aspect of the mathematics of the analytics needs to go through this rigorous process. All of this will add to compliance costs, as vetting typically costs a multiple of the rewrite itself. The FRTB has put in place stringent model validation standards, along with hypothetical portfolios against which to benchmark models.

The FRTB also requires data lineage and audit capabilities for the data. Banks will need to establish a visual representation of the overall process as data flows from the BORT systems to the reporting applications. All data assets have to be catalogued and a thorough metadata management process instituted.
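Below is a minimal sketch of the kind of lineage record a metadata catalogue (for example, Apache Atlas in a Hadoop stack) would hold for each hop from a BORT feed to a report. The structure is illustrative and is not Atlas's actual API.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class LineageRecord:
    """One hop in the data flow, for audit and lineage reporting (illustrative)."""
    source: str            # e.g. "bort.equities.trades"  (hypothetical name)
    transformation: str    # e.g. the batch job that moved/reshaped the data
    target: str            # e.g. "curated.desk_positions" (hypothetical name)
    run_id: str
    timestamp: datetime = field(default_factory=lambda: datetime.now(timezone.utc))
    tags: tuple = ("frtb", "auditable")

hop = LineageRecord(
    source="bort.equities.trades",
    transformation="frtb-feed-prep",
    target="curated.desk_positions",
    run_id="2016-11-16-001",
)
print(hop)
```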

What Must Bank IT Do… 

Given all of the above data & application complexity and the need to adopt agile analytical methods – what is the first step that enterprises must take?

We will examine a reference architecture for the FRTB in the next and final post in this series.
