Given the phenomenal growth of the financial services industry over the past decade, and the much greater demand on data provisioning that came with it, sizeable investment in risk data aggregation frameworks was imperative to support a bank’s profitable business model. The rewards of such investment would have been immeasurable during the recent crisis: a complete view of the risk being run against each exposure, counterparty, customer, product, instrument, entity and so on could have been established in minutes rather than days. One example was the delay in establishing a bank’s total group exposure and risk to Lehman Brothers at the height of the crisis. Not having this information at hand inevitably led to some sub-optimal decisions on how best to weather the impending storm and the fallout shortly afterwards.
As many will be aware, the Basel Committee, as part of its detailed ongoing review of what went wrong during the global financial crisis and what should be done to blunt the impact of a future one, released BCBS 239, “Principles for effective risk data aggregation and risk reporting”, in January 2013. It was no surprise that the primary focus was on how, overall, banks had been less than agile in producing a complete, granular, transparent aggregated view of the risks they faced.
Whether this was due to a less than robust risk data governance infrastructure, limited risk data aggregation capabilities or opaque reporting practices, it is clear that banks must commit considerable resources and effort to ensure they are fully compliant with all 11 principles of BCBS 239 by 1 January 2016.