Learn how Financial Risk organizations can benefit from both BCBS 239 practices and Data Mesh pillars to create consumable and explainable datasets.
For investment organizations, where critical decisions on risk, capital, and trades are made daily, the main data challenges lie in data quality, storage, aggregation, and running calculations at scale.
These challenges stem largely from the fact that vast data repositories, like data lakes, are often not practical for business use. Adding to the problem are the high and unpredictable costs of cloud computing and storage, which are essential for handling complex, real-time calculations on massive datasets.
Moreover, the critical processes of backtesting, executing 'what-if' scenarios, and providing intraday refreshes are hampered by these limitations.
The core issue is the increasing demand for data consumption, complicated by technical debt in data systems, widespread data silos, and the difficulty of applying business logic to large, often unstructured data repositories. As a result, accessing, combining, and analyzing data in a meaningful and timely way has become increasingly costly and complex.
BCBS 239, introduced after the financial crisis, was designed to strengthen risk management in banking by improving data practices. It emphasized the need for robust data governance, effective data aggregation, and reliable reporting. However, it was developed during the early days of cloud computing and big data.
Data Mesh emerged when big data technologies had matured. It was designed to overcome major bottlenecks caused by the centralized nature of data lakes and IT organizations, which increasingly operated as shared services across all business lines.
BCBS 239 targets risk analytics practices specifically for the regulated banking sector, while Data Mesh offers a broader, more versatile framework. Unlike traditional models that layer responsibilities and architectural components in a project-based approach, Data Mesh emphasizes treating data as a product. It promotes clear data ownership, stronger systemic governance, and removes technological and structural bottlenecks, making it a powerful solution for organizations seeking scalable, collaborative data strategies.
In this article, we explore how Financial Risk organizations can leverage BCBS 239 practices and Data Mesh principles to their advantage. We explain how these frameworks complement each other and how they can inspire a new implementation model for creating risk datasets that are both consumable and explainable.
BCBS 239, the Basel Committee on Banking Supervision's standard number 239, was introduced in response to the 2007-2008 global financial crisis. The crisis highlighted significant shortcomings in banks' information technology and data architecture systems, particularly in risk data aggregation and risk reporting practices. These deficiencies impeded the ability of banks to identify and manage risks effectively.
The main goal of BCBS 239 is to enhance banks' risk data aggregation capabilities and internal risk reporting practices. This standard plays a vital role in strengthening the resilience and risk management of banks, especially global systemically important banks (G-SIBs). By setting clear principles, BCBS 239 helps these institutions improve their ability to identify, measure, and manage risk with greater accuracy and efficiency.
The principles of BCBS 239 cover four key areas: overarching governance and infrastructure, risk data aggregation capabilities, risk reporting practices, and supervisory review. Together, these principles strengthen banks' ability to aggregate and report risk data, especially during financial crises.
Quantitative risk areas, such as market, credit, counterparty, and liquidity risk, require real-time, scalable, and comprehensive data management solutions. Key requirements, such as FRTB's Standardized Approach (SA) and Internal Models Approach (IMA), risk-weighted asset (RWA) calculations, and liquidity ratios, including the Net Stable Funding Ratio (NSFR) and the Liquidity Coverage Ratio (LCR), highlight the need for detailed data aggregation, simulations, and 'what-if' scenarios.
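To make these demands concrete, consider how even a heavily simplified liquidity ratio is assembled from weighted aggregations across many source systems. The Python sketch below is illustrative only: the asset categories, runoff rates, and figures are hypothetical and do not reproduce the actual Basel LCR specification.

```python
# Illustrative sketch of a simplified liquidity coverage ratio calculation.
# The categories, weights, and figures are hypothetical and do not reproduce
# the full Basel LCR specification.

# High-quality liquid assets (HQLA): (market value, weight after haircut)
hqla = {
    "level_1_sovereign": (500.0, 1.00),
    "level_2a_covered":  (120.0, 0.85),
    "level_2b_equity":   (60.0,  0.50),
}

# Cash flows over a 30-day stress horizon: (balance, runoff or inflow rate)
outflows = {
    "retail_deposits":   (800.0, 0.05),
    "wholesale_funding": (300.0, 0.40),
}
inflows = {
    "performing_loans":  (150.0, 0.50),
}

weighted_hqla = sum(value * weight for value, weight in hqla.values())
total_outflows = sum(balance * rate for balance, rate in outflows.values())
total_inflows = sum(balance * rate for balance, rate in inflows.values())

# Under the LCR rules, inflows are capped at 75% of gross outflows
net_outflows = total_outflows - min(total_inflows, 0.75 * total_outflows)

lcr = weighted_hqla / net_outflows
print(f"LCR = {lcr:.2%}")  # regulatory minimum is 100%
```

Even this toy version shows why the underlying data must be complete, well-classified, and fresh: every weight and cap depends on correctly tagged source records.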
Data Mesh shifts the focus from centralized data lakes or warehouses to a more democratized, product-centric approach. It rests on four foundational pillars: domain-oriented data ownership, data as a product, a self-serve data platform, and federated computational governance.
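As a rough illustration of the "data as a product" pillar, each dataset can ship with an explicit, machine-readable contract naming its owner, domain, schema, freshness, and quality guarantees. The descriptor below is a minimal, hypothetical sketch; the fields and values are assumptions for illustration, not an established standard.

```python
# Hypothetical sketch of a data product descriptor, in the spirit of the
# "data as a product" pillar. Field names and values are illustrative only.
from dataclasses import dataclass, field

@dataclass
class DataProduct:
    name: str                    # discoverable, addressable identifier
    domain: str                  # owning business domain (domain ownership)
    owner: str                   # accountable team or individual
    schema_ref: str              # pointer to the published schema/contract
    freshness_sla: str           # e.g. "intraday", "T+1"
    quality_checks: list = field(default_factory=list)  # governed policies

counterparty_exposures = DataProduct(
    name="counterparty_exposures_v2",
    domain="credit-risk",
    owner="credit-risk-data-team",
    schema_ref="catalog://credit-risk/counterparty_exposures/v2",
    freshness_sla="intraday",
    quality_checks=["no_null_counterparty_id", "exposure_non_negative"],
)
```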
The Data Mesh approach is particularly important for trade management and execution. For buy-side institutions facing increasing regulatory pressure, the ability to quickly access, process, and analyze data is crucial.
In the evolving landscape of risk, treasury, and trade execution analytics, there's a growing need to combine the structure of BCBS 239 with the flexibility of Data Mesh principles. This process involves several steps, from initial workshops to the final self-service implementation for end-users.
Step 1: Workshops for Business Requirements and Technical Feasibility
The first step is to hold workshops that focus on understanding business needs and assessing technical feasibility. These sessions allow stakeholders to share their requirements while technical teams identify challenges and solutions. The goal is to align business objectives with technical capabilities.
Step 2: Building a Comprehensive Data Model
Next, a detailed data model is built to cover various data types and sources, reflecting the complexity of risk, treasury, and trade execution. The model must be robust, scalable, and flexible, able to adapt to changing business needs.
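As a minimal sketch of what such a model might contain, the fragment below defines two of its core entities: trades and the risk measures derived from them. The entities and fields are hypothetical placeholders, not a complete risk, treasury, and trade execution schema.

```python
# Hypothetical sketch of a small slice of the data model: trades and the
# risk measures derived from them. Entity and field names are illustrative.
from dataclasses import dataclass
from datetime import date

@dataclass
class Trade:
    trade_id: str
    counterparty_id: str
    instrument_id: str
    notional: float
    currency: str
    trade_date: date

@dataclass
class RiskMeasure:
    trade_id: str          # links the measure back to its trade
    measure_type: str      # e.g. "PV", "DV01", "VaR_contribution"
    value: float
    as_of: date
    source_system: str     # lineage: where the number was produced
```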
Step 3: Creating a Semantic Layer for Enhanced Navigability
A semantic layer is added to make data easier to access and understand. This layer acts as a bridge between the complex data model and end-users, helping them navigate large datasets with ease. It simplifies the user experience by hiding the underlying data complexity.
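A hand-rolled sketch of the idea follows, assuming a simple dictionary-based mapping: business-friendly metric and dimension names are translated into SQL against the physical model, so consumers never touch the underlying tables. Production semantic layers (in BI tools or metrics stores) are far richer; all names here are illustrative.

```python
# Hypothetical sketch of a thin semantic layer: business terms are mapped to
# physical tables and columns, and queries are expressed against the terms.
SEMANTIC_MODEL = {
    "Counterparty Exposure": {
        "table": "risk.counterparty_exposures",
        "measure": "SUM(exposure_amount)",
        "dimensions": {
            "Counterparty": "counterparty_name",
            "Rating": "internal_rating",
            "As Of Date": "as_of_date",
        },
    },
}

def build_query(metric: str, group_by: list[str]) -> str:
    """Translate a business-level request into SQL on the physical model."""
    model = SEMANTIC_MODEL[metric]
    cols = [model["dimensions"][d] for d in group_by]
    alias = metric.replace(" ", "_").lower()
    return (
        f"SELECT {', '.join(cols)}, {model['measure']} AS {alias} "
        f"FROM {model['table']} GROUP BY {', '.join(cols)}"
    )

print(build_query("Counterparty Exposure", ["Counterparty", "Rating"]))
```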
Step 4: Building a Physical Data Store with Embedded Business Logic and Data Quality
A physical data store is then built, with business logic and data quality controls embedded directly into the data. This ensures the data is stored efficiently, and is processed and validated for reliability and accuracy. Embedding business logic at this stage aligns with Data Mesh principles, bringing intelligence closer to the data.
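One way to picture this, assuming a simple record-at-a-time write path, is to enforce quality rules inside the write function itself, so that nothing lands in the store unvalidated. The rules and field names below are hypothetical.

```python
# Hypothetical sketch of data-quality rules enforced at write time, so that
# records landing in the physical store are already validated.

QUALITY_RULES = [
    ("trade_id must be present",  lambda r: bool(r.get("trade_id"))),
    ("notional must be positive", lambda r: r.get("notional", 0) > 0),
    ("currency must be ISO-like", lambda r: len(r.get("currency", "")) == 3),
]

def validate(record: dict) -> list[str]:
    """Return the list of rule violations for a record (empty means clean)."""
    return [name for name, check in QUALITY_RULES if not check(record)]

def write_record(record: dict, store: list) -> None:
    """Embed validation in the write path: reject bad records at the source."""
    violations = validate(record)
    if violations:
        raise ValueError(f"rejected {record.get('trade_id')}: {violations}")
    store.append(record)

store: list = []
write_record({"trade_id": "T1", "notional": 1_000_000, "currency": "USD"}, store)
```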
Step 5: Enabling True Self-Service for Data Consumers
The final step is to offer true self-service capabilities to data consumers. This can be achieved in two ways:
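Whichever channels are chosen, the end state looks similar from the consumer's side: governed data products that can be pulled and analyzed without raising an IT ticket. The sketch below is a hypothetical illustration, using a stand-in read API and an ad-hoc 'what-if' shock of the kind mentioned earlier.

```python
# Hypothetical sketch of self-service consumption. load_data_product is a
# stand-in for a governed data-product read API; the records are dummy data.

def load_data_product(name: str) -> list[dict]:
    """Stand-in for a discoverable, governed data-product read API."""
    return [
        {"counterparty": "ACME", "exposure": 120.0},
        {"counterparty": "GLOBEX", "exposure": 80.0},
    ]

exposures = load_data_product("counterparty_exposures_v2")

# Ad-hoc what-if: shock all exposures by 15% and re-aggregate, with no
# involvement from the central data team.
shocked_total = sum(row["exposure"] * 1.15 for row in exposures)
print(f"Total exposure after a 15% shock: {shocked_total:.1f}")
```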
By following these steps, organizations can build a strong analytics framework that combines the benefits of BCBS 239 and Data Mesh principles. This hybrid model ensures compliance, scalability, and flexibility, meeting the evolving needs of risk, treasury, and trade execution analytics. It offers a forward-thinking approach to data management, blending regulatory requirements with the agility of modern data architectures.
About the author: Emmanuel Richard is a Data and Analytics expert with over 25 years of experience in the technology industry. His extensive background includes leadership roles at industry giants and startups across the US and Europe.