The Good, the Bad and the Illiquid
By Onur Cetin, Senior Sales Executive (UK)
Liar’s Poker used to be essential reading for aspiring bankers. Among the many scenes set on the cigar-smoke-filled trading floors of Salomon Brothers and other relics of a reckless, bygone era, there is one I still remember even though I read it close to two decades ago, long before the credit crunch. Two bankers pick a hard-to-predict event and simulate the chain reaction that follows: what would happen to the stock market, the second- and third-order consequences, who the winners and losers would be, and so on. At the time, it struck me as an inspirational way of looking at the world. It raises the question of whether, since those days, banks have evolved to take uncertainty into account systemically rather than as a pure thought experiment. I have good news and slightly bad news for you.
Before we get to the answer, let’s narrow the question down. Forget about complex derivatives or securitisations with huge pools of assets; let’s focus on cash flows – what’s coming in, what’s going out – and the liquid instruments sitting safely on the bank’s balance sheet. Regulation in the form of the LCR (Liquidity Coverage Ratio) and the NSFR (Net Stable Funding Ratio) ensures there are enough of these assets, convertible immediately to cash, to cover outflows. Now let’s consider whether this familiar daily business can still be done, and at what speed, when the bank is contending with an ad hoc, granular stress scenario.
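In their simplest form, the two ratios are just quotients that must stay at or above 100%. A minimal sketch, with made-up figures purely for illustration:

```python
# Illustrative sketch of the two regulatory ratios; all figures are invented.

def lcr(hqla: float, net_outflows_30d: float) -> float:
    """Liquidity Coverage Ratio: high-quality liquid assets divided by
    total net cash outflows expected over a 30-day stress window."""
    return hqla / net_outflows_30d

def nsfr(available_stable_funding: float, required_stable_funding: float) -> float:
    """Net Stable Funding Ratio: available stable funding divided by
    required stable funding over a one-year horizon."""
    return available_stable_funding / required_stable_funding

# Both ratios must stay at or above 100%.
print(f"LCR:  {lcr(120, 100):.0%}")   # LCR:  120%
print(f"NSFR: {nsfr(105, 100):.0%}")  # NSFR: 105%
```

The arithmetic is trivial; the hard part, as the rest of this piece argues, is assembling the inputs quickly enough under an arbitrary scenario.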
Setting aside the debate about the correlation between these two ratios and the potential redundancy of one or the other, the good news is that regulatory frameworks have led to better management of liquidity risk. By now, banks are convinced that, far from being a box-ticking exercise, this is a genuinely useful framework worth building simulation capacity around. A tool like that gives risk managers good stories to tell, and plenty of people inside the bank willing to listen.
The slightly bad news is that making these calculations in a forward-looking way is not a trivial process, even for mature systems. We are not talking about complex predictive modelling; the real challenge banks face is integrating data from a wide range of sources and aggregating it quickly enough for simulation to be a practically useful tool. This brings us to the reason the bad news is only ‘slightly’ bad: big data technology is evolving as fast as these data challenges, so the problem is neither as daunting nor as costly as it seems.
The main integration and aggregation challenge is easy to visualise. IT teams pre-aggregate data – that is, they do the sums for you – so you don’t have to process the entire dataset on the fly. This is what you would expect: analysts look at the same books, sectors and countries every day, and regulatory stress specifications do not change overnight. It is a good idea until an ad hoc scenario upsets the status quo. That is when the granularity problem appears: you want to stress a handful of companies rather than an entire sector. Related issues follow, such as introducing new data sources or reshaping your data model to fit the circumstance.
Processing all of the data – even joining different databases to make a single happy home for every data source you might need – is one solution to these issues; however, its simplicity is deceptive. Doing it all in memory, on new servers with ever more RAM, is punitively expensive, even before we take into account the current global chip shortage. Hard disks are much, much cheaper. If you could put the data on disk and inspect and customise your scenarios as needed – provided the system is fast enough – your IT problems would recede, and better problems would emerge as your new tools expand your horizons. Immediate accessibility of the data also makes real-time monitoring of cash flows possible, so you can understand what is going on during the day and have the agility to make quick decisions when needed. Disk space is no longer a constraint, so you can store everything and explain your decisions; an audit trail is a natural by-product of this setup. This is exactly how Opensee is powering innovation in liquidity management platforms and delivering it to tier-1 banks.
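To sketch the idea (and only the idea – this is not Opensee’s actual technology), here is the same toy data in SQLite, standing in for a disk-backed analytical store. Because every raw row is kept, the daily sector view and the ad hoc company drill-down are just two queries over the same data, and the stored rows double as an audit trail:

```python
import sqlite3

# Minimal sketch using SQLite as a stand-in for a disk-backed store.
# Data and names are invented for illustration.
con = sqlite3.connect(":memory:")  # a real setup would point at a file on disk
con.execute("CREATE TABLE flows (ts TEXT, company TEXT, sector TEXT, amount REAL)")
con.executemany(
    "INSERT INTO flows VALUES (?, ?, ?, ?)",
    [
        ("2024-01-02T09:00", "AcmeCorp", "retail", -40.0),
        ("2024-01-02T11:30", "BetaLtd",  "retail", -60.0),
        ("2024-01-02T14:15", "GammaPlc", "energy", 30.0),
    ],
)

# The routine sector view for the daily business...
print(con.execute(
    "SELECT sector, SUM(amount) FROM flows GROUP BY sector ORDER BY sector"
).fetchall())  # [('energy', 30.0), ('retail', -100.0)]

# ...and a company-level drill-down for the ad hoc scenario, same data,
# no overnight rebuild of a pre-aggregated cube required.
print(con.execute(
    "SELECT company, amount FROM flows WHERE company = 'BetaLtd'"
).fetchall())  # [('BetaLtd', -60.0)]
```

The design choice is to aggregate at query time rather than ingestion time, which is what makes arbitrary granularity cheap.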
A recurring theme in Liar’s Poker is the growth of corporate structure as banks expand. The traders’ thought experiments were probably their way of clinging to creativity as systems took over the day-to-day routine. This is the area I most enjoy discussing with Opensee clients. Once the granularity problem is solved and a powerful simulation engine is in place, a whole new dimension of uses opens up and creativity is unleashed. There could even be ways to unravel how different metrics and dynamics interact throughout the bank, and to share the scenario results with anyone interested in seeing them. Big data is challenging, but the ongoing technological breakthroughs give the new generation of bankers the power to harness it for new ideas.