Is ESG the next big data problem?

Environmental, Social and Governance (ESG) concerns are unavoidable in financial services. See how Opensee helps banks meet their ESG goals.

by Christophe Rivoire, June 15, 2021

Overview:

  • The Fundamental Review of the Trading Book (FRTB) forced banks to rethink their data management frameworks. The lessons learned from FRTB are now being applied to ESG.
  • The Task Force on Climate-Related Financial Disclosures (TCFD) is planning to update its guidance to recommend providing forward-looking information. This presents banks with a lot of new and unfamiliar data to handle in the coming years.
  • ESG data is about to become a Big Data problem similar to FRTB. Well-designed data platforms will be key for banks, giving many users easy access and allowing experimentation with the new datasets.

When it comes to Environmental, Social and Governance (ESG), forward-looking banks aren’t waiting for the stick of regulation to address their Big Data needs.

As ESG makes more headlines, shareholders and the general public are pressuring banks to have a proper strategy and to manage reputational risk, creating an imperative to respond quickly. The push for greater auditability and transparency means banks are generating huge volumes of additional data, posing significant risks to already stretched systems. No bank wants to face a crash caused by running out of data memory, runaway costs, or the frustration of those affected.

At Opensee we are increasingly getting involved in ESG integration projects, solving data analytics problems across our clients’ departments. This set me thinking of recent parallels where a data explosion forced a system redesign on banks. The Fundamental Review of the Trading Book (FRTB) sprang immediately to mind. It seems the scars of complying with FRTB, which kept many banks as well as Opensee very busy, are still fresh for those who didn’t tackle the data challenges early on.  

FRTB required banks to set aside capital, consider periods of stress going back a decade and assess liquidity by counting transactions. Even though the entire volume was rarely used in the past, the data is familiar and standardised. Still, banks only started rethinking their systems when the scale became clear. AWS estimates that FRTB increased data volumes tenfold; our experience suggests this is conservative. Then, of course, there was the ‘auditability’ requirement.
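To make the ‘counting transactions’ point concrete, FRTB’s risk factor eligibility test asks whether each risk factor has enough real-price observations to be treated as modellable. The sketch below is a simplified illustration only: it uses the thresholds from the original 2016 text (at least 24 observations in 12 months, no gap longer than roughly a month), the 2019 revision changed the details, and this is not Opensee’s implementation.

```python
from datetime import date, timedelta

def is_modellable(observation_dates, min_obs=24, max_gap_days=31):
    """Illustrative FRTB-style eligibility check: enough real-price
    observations, with no gap between consecutive observations longer
    than max_gap_days."""
    if len(observation_dates) < min_obs:
        return False
    ordered = sorted(observation_dates)
    gaps = (later - earlier for earlier, later in zip(ordered, ordered[1:]))
    return all(gap <= timedelta(days=max_gap_days) for gap in gaps)

# Example: a risk factor observed roughly every two weeks over a year passes.
observations = [date(2020, 1, 1) + timedelta(days=14 * i) for i in range(26)]
print(is_modellable(observations))  # True
```

Multiply that bookkeeping across tens of thousands of risk factors and a decade of history, and the data-volume point above becomes obvious.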

The lesson from FRTB

Perhaps, after the FRTB experience, banks realised that being late to the game is too costly, and they are now applying this lesson to ESG.

Voluntary reporting is becoming widespread and the Task Force on Climate-Related Financial Disclosures (TCFD) is planning to update its guidance to recommend providing forward-looking information. In Europe alone there are plenty of signs that a regulatory response will come in one form or another: the Bank of England announced its 2021 Climate Biennial Exploratory Scenario; France’s Autorité de Contrôle Prudentiel et de Résolution (ACPR) ran a 2020 Climate Pilot Exercise; and the European Banking Authority (EBA) launched a public consultation on Pillar 3 disclosures of ESG risks, including a proposal for a Green Asset Ratio.

This presents banks with a lot of new and unfamiliar data to handle in the coming years. Even if banks don’t have to take ESG into account for capital purposes, there is significant momentum driving its Big Data challenges in spite of many uncertainties.

The great ESG data uncertainty

In the early days of FRTB, there was an element of learning during the process. For ESG we are looking at a more extreme case. While ESG doesn’t have the historical market data and transaction dimensions on which FRTB is based, it does have a Big Data issue in the form of a much longer time horizon and a reliance on an evolving data landscape.

Let’s consider first the practical case of dealing with data vendors.

In contrast to the relative consistency of market data, using new third-party data is unavoidable for ESG, and vendors apply different methodologies to even a simple metric. There are already more than 20 different ways in which companies report their Employee Health and Safety metrics.

There will also be data gaps spanning ranges of companies and time periods, which may require complementary data from several vendors. The inconsistencies in this non-standardised data, as an ASIFMA poll indicates (copied below), present a new and serious challenge for most banks.

ASIFMA poll: 56% of respondents say their greatest ESG data challenge is inconsistent data.
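To illustrate what reconciling vendors looks like in practice, here is a minimal sketch that combines two hypothetical feeds for the Employee Health and Safety example above, normalising units and falling back to a secondary vendor where the primary has gaps. The vendor data, field names and pandas-based approach are illustrative assumptions, not a description of any particular vendor or of Opensee’s platform.

```python
import pandas as pd

# Hypothetical vendor feeds: same metric, different coverage and conventions.
vendor_a = pd.DataFrame({
    "company": ["ACME", "Globex", "Initech"],
    "year": [2020, 2020, 2020],
    "health_safety_incident_rate": [1.2, None, 0.8],  # incidents per 200k hours
})
vendor_b = pd.DataFrame({
    "company": ["ACME", "Globex"],
    "year": [2020, 2020],
    "hs_incidents_per_million_hours": [6.1, 4.5],
})

# Normalise vendor B to vendor A's convention (per 200,000 hours worked).
vendor_b["health_safety_incident_rate"] = (
    vendor_b["hs_incidents_per_million_hours"] * 0.2
)

# Prefer vendor A where it reports a value, fall back to vendor B otherwise.
merged = vendor_a.merge(
    vendor_b[["company", "year", "health_safety_incident_rate"]],
    on=["company", "year"], how="outer", suffixes=("_a", "_b"),
)
merged["health_safety_incident_rate"] = (
    merged["health_safety_incident_rate_a"]
    .fillna(merged["health_safety_incident_rate_b"])
)
print(merged[["company", "year", "health_safety_incident_rate"]])
```

Even this toy example involves a unit conversion and a precedence rule; at the scale of a full portfolio, every such choice needs to be recorded and auditable.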

The Data Tsunami

The second challenge facing banks is handling the sheer quantity of data. Let’s consider this in the context of the ‘E’ in ESG through the example of a bank’s loan origination team looking at a company’s self-disclosed emissions.

This would start by including direct emissions as well as the indirect (Scope 3) emissions that occur across the value chain, potentially adding many more companies to the analysis. If this sounds like too much detail, note that supply chain emissions could be 5.5 times as high as direct emissions, according to some estimates.
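As a rough illustration of why the analysis fans out so quickly, the sketch below totals a borrower’s reported direct emissions with an estimate of upstream Scope 3 emissions from hypothetical suppliers, attributed by the share of each supplier’s revenue that comes from the borrower. The figures, company names and attribution method are simplifying assumptions; real Scope 3 accounting follows the GHG Protocol’s far more detailed categories.

```python
# Hypothetical figures in tonnes of CO2e; attribution by revenue share is a
# deliberate simplification of GHG Protocol Scope 3 accounting.
borrower = {"name": "ACME", "direct_emissions": 120_000.0}

suppliers = [
    # (supplier, supplier's total reported emissions, share of its revenue from ACME)
    ("SteelCo",   950_000.0, 0.08),
    ("LogistiCo", 300_000.0, 0.15),
    ("PackCorp",   80_000.0, 0.40),
]

scope3_upstream = sum(total * share for _, total, share in suppliers)
combined = borrower["direct_emissions"] + scope3_upstream

print(f"Direct:           {borrower['direct_emissions']:>10,.0f} tCO2e")
print(f"Upstream Scope 3: {scope3_upstream:>10,.0f} tCO2e")
print(f"Combined:         {combined:>10,.0f} tCO2e")
```

Even in this toy case the upstream estimate exceeds the borrower’s own disclosure, and each supplier brings its own disclosures, gaps and vendor estimates into scope.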

After the supply chain of the corporate parent entity, the team would assess physical risk: say, the impact of a hurricane or flood scenario on the company’s manufacturing facilities. This requires a database of facilities with geospatial information.
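A minimal sketch of that kind of geospatial screen follows, assuming a flat facility register with latitude/longitude and hazard zones approximated as circles; real physical-risk models use far richer hazard maps and scenario data, and the names and coordinates below are hypothetical.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

# Hypothetical facility register and flood-zone centres (radius in km).
facilities = [
    {"name": "ACME plant A", "lat": 29.76, "lon": -95.37},
    {"name": "ACME plant B", "lat": 48.86, "lon": 2.35},
]
flood_zones = [{"lat": 29.95, "lon": -95.40, "radius_km": 50.0}]

for facility in facilities:
    exposed = any(
        haversine_km(facility["lat"], facility["lon"], zone["lat"], zone["lon"])
        <= zone["radius_km"]
        for zone in flood_zones
    )
    print(facility["name"], "exposed" if exposed else "not exposed")
```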

This example highlights the data needs of the loan origination desk but, of course, risk managers and other parts of the bank will also need to assess and present results in different ways, requiring yet more data. This might include several stress test reports, a ‘portfolio temperature’ to keep track of a science-based target, or detailed exposures in TCFD format. Other teams keep track of ongoing corporate engagements across the portfolio as well as potentially different taxonomies across countries, also with separate data needs.  
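Methodologies for a ‘portfolio temperature’ vary widely; one simple and commonly used approach is an exposure-weighted average of each issuer’s implied temperature rise as supplied by a vendor. The sketch below assumes that approach, with hypothetical positions and figures.

```python
# Hypothetical positions: exposure in EUR and each issuer's implied
# temperature rise (ITR) in degrees Celsius, as supplied by a data vendor.
positions = [
    {"issuer": "ACME",    "exposure": 40_000_000, "itr_c": 2.6},
    {"issuer": "Globex",  "exposure": 25_000_000, "itr_c": 1.8},
    {"issuer": "Initech", "exposure": 10_000_000, "itr_c": 3.1},
]

total_exposure = sum(p["exposure"] for p in positions)
portfolio_temperature = sum(
    p["exposure"] / total_exposure * p["itr_c"] for p in positions
)
print(f"Portfolio temperature: {portfolio_temperature:.2f} C")  # 2.40 C
```

Swap in a different methodology, taxonomy or stress scenario and the inputs change, which is exactly why the underlying datasets need to stay easy to query and re-cut.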

As there is no clear view today as to what will need to be done tomorrow, the need for flexibility is obvious. Integration of new datasets, internal and external, will not stop, nor will the requirement to access these datasets easily.

Dealing with ESG’s Big Data Problem

In the case of FRTB regulations, a few brave banks acted early with a data management rethink and solved many other problems in the process. Opensee and others provided the new technologies to enable granular analysis and perform iterations quickly.

ESG is about to become a similar Big Data problem, with the added complexity of uncertainties in models and constantly evolving unfamiliar datasets.

While there is an ongoing discussion about ESG data quality, there is no time to wait for full standardisation: the pressure for integration is strong. Well-designed data platforms give many users access as easily as possible and allow experimentation with the new datasets, meaning many banks will be well placed when the regulations do eventually arrive.
