
Why Sharp-Sighted Banks Have Moved to Real-Time Risk Management


Claire Vorster, Writer

Consider this scenario: Your bank’s portfolio holds a mismatch of assets, it lacks High-Quality Liquid Assets (HQLA), and its operational processes are inefficient. When do the real costs of operating this way become apparent? How will you protect yourself from the unexpected and adverse, and make better decisions than your competitors? While risk teams are typically seen as the first line of defence, Treasury and Operations are often the frontline observers of irregular or unusual behaviour that can signify underlying problems. However, the effectiveness of this early detection relies heavily on accurate, data-led decision-making. The conventional approach of pulling data to identify anomalies is reactive and prone to oversight. Instead, a proactive approach is needed, where the system itself, equipped with clever analytics tools like Realiti, pushes relevant data and flags anomalies based on trends and comprehensive datasets.
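To make the pull-versus-push distinction concrete, here is a minimal sketch of a push-style monitor that scores each incoming intraday movement against a rolling baseline and raises an alert when it looks anomalous. It is an illustration only, not Realiti’s implementation; the account feed, window size and 3-sigma threshold are assumptions.

```python
from statistics import mean, stdev

# Illustrative push model: rather than analysts pulling reports and scanning
# for problems, every incoming movement is checked against a rolling baseline
# and an alert is pushed the moment it looks unusual.
# The window size and 3-sigma threshold are assumptions for this sketch.

WINDOW = 30          # number of recent movements forming the baseline
Z_THRESHOLD = 3.0    # flag movements more than 3 standard deviations out

history: dict[str, list[float]] = {}

def on_movement(account: str, amount: float) -> None:
    """Called by the (hypothetical) feed handler for every intraday movement."""
    past = history.setdefault(account, [])
    if len(past) >= WINDOW:
        mu, sigma = mean(past), stdev(past)
        if sigma > 0 and abs(amount - mu) / sigma > Z_THRESHOLD:
            push_alert(account, amount, mu, sigma)
    past.append(amount)
    del past[:-WINDOW]   # keep only the most recent window

def push_alert(account: str, amount: float, mu: float, sigma: float) -> None:
    # In practice this would notify Treasury/Operations dashboards or messaging.
    print(f"ALERT {account}: movement {amount:,.0f} vs baseline {mu:,.0f} ± {sigma:,.0f}")
```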

Mark Crowhurst has deep experience in treasury and investment banking in the US and Europe. Now at Baringa, Crowhurst specialises in risk management and optimising liquidity, working with over 100 clients worldwide, ranging from global banks to payment startups. 

Crowhurst and Planixs liquidity expert Nick Applebee are seeing many firms move from pull to push models to enhance risk detection. They discussed how real-time monitoring and predictive analytics enable proactive risk management to address emerging threats. Crowhurst says,

“Holding capital, it helps, obviously… But you know, you’re putting a sticking plaster on a wound. Surely the idea is to prevent that wound from occurring in the first place.”

If data quality is not good, you’re…?

In batch process-oriented banking, risk analyses are available, at best, one or two days after the fact. In most cases, risk professionals have to make do with data that is days, weeks, or even months out of date. Processing is therefore inaccurate, and important risk-related costs, such as liquidity costs, are not fully considered. A slew of bank failures and near misses has taught us that what we don’t know can cause enormous pain, which makes the urgent case for real-time risk management. In addition, if you don’t know what you have, where it is, and whether you can use it, you won’t know how much value is being left on the table. And someone is bound to ask.

“It was a completely different world then where if you hold enough high-quality assets for 30 days, you’d protect your customers in time of stress. Things have changed.” —Mark Crowhurst, Liquidity Talks, 2024

A glaring issue, mandated by Basel, is the need to demonstrate data timeliness, accuracy, consistency, and completeness. If key data metrics or indicators fall below a predetermined threshold, your regulator can demand a fresh, more accurate report.


For these reports to clarify exposure estimates, you must include counterparty and deal data. You also need statistical techniques and relationship visualisations so that non-statisticians and non-IT specialists can understand the risk. The problem may not be with your risk model: when data relationships are not clearly understood, poor decisions follow.


Self-interest makes banks sharp-sighted

There has been cumulative pressure on banks stemming from regulator demands, the added stress of BCBS 239 requirements, data complexity, market volatility, margin constraints, and interconnectivity. This pressure drives many to adopt a sharp focus on risk management, primarily motivated by self-interest. Crowhurst says,


“Your first round of ammunition is your intraday movements. So what’s happening intraday? Because things do happen 24/7. Having a good set of monitoring and early warning tools around that is your canary in the coal mine.”

Actionable, accurate risk management

Real-time technology in risk management does not simply move your bank from batch processing to transaction-focused, continuous processing. Instead, it transforms your processes, enabling faster, more accurate decisions. You gain:


– Data visualisation   High-performance reporting makes risk-related data easy for stakeholders to understand, so they can see the important connections between millions or even billions of individual data points.

– Simulations   Forecast, describe and visualise the likely effects of a decision – for instance, how concluding a hedging transaction, directly after the deal is closed, affects the risk profile of the overall portfolio. Banks that do this faster than their competitors have a clear competitive advantage.


“The ability to make use of timely and accurate data sets can support treasurers when managing their cash flow forecasting, payments and collections. Data can also be leveraged for risk management purposes, as it makes it easier for treasurers to track positions like currency or interest rate exposures.” HSBC, Top five trends shaping the transformation of treasury, 2023

– Stress Testing   What-if scenarios for as-yet incomplete transactions are now possible. To account for the uncertainty of future value fluctuations and predict cash flow changes, a bank must be able to run through possible scenarios quickly (see the sketch after this list). Equally important, the results of analyses are available rapidly – in time to influence the daily decisions of risk managers.


– Revenue opportunities   By capitalising on market fluctuations and improving customer pricing, banks can drive growth and profitability.
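As a rough illustration of the stress-testing point above, the sketch below applies a handful of what-if scenarios to a set of projected cash flows and reports the stressed end-of-day position. The scenario names, shock factors and cash-flow structure are assumptions for the example, not a prescribed methodology.

```python
from dataclasses import dataclass

# Illustrative what-if runner: applies simple shocks to projected cash flows
# and reports the stressed end-of-day position. Scenario names and shock
# factors are assumptions for this sketch.

@dataclass
class CashFlow:
    counterparty: str
    amount: float        # positive = inflow, negative = outflow
    confirmed: bool      # False = as-yet incomplete transaction

SCENARIOS = {
    "base":               {"unconfirmed_haircut": 0.0, "outflow_shock": 1.00},
    "counterparty_delay": {"unconfirmed_haircut": 0.5, "outflow_shock": 1.00},
    "outflow_spike":      {"unconfirmed_haircut": 0.0, "outflow_shock": 1.25},
}

def end_of_day_position(opening: float, flows: list[CashFlow], scenario: dict) -> float:
    position = opening
    for cf in flows:
        amount = cf.amount
        if amount > 0 and not cf.confirmed:
            amount *= 1 - scenario["unconfirmed_haircut"]   # inflows may not arrive
        if amount < 0:
            amount *= scenario["outflow_shock"]             # outflows may be larger
        position += amount
    return position

# Example run over three hypothetical flows and a 10m opening balance.
flows = [CashFlow("Bank A", 40_000_000, True),
         CashFlow("Fund B", 25_000_000, False),
         CashFlow("Clearing", -55_000_000, True)]

for name, scenario in SCENARIOS.items():
    print(f"{name:>18}: {end_of_day_position(10_000_000, flows, scenario):,.0f}")
```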


When banks come to us, we talk about ‘getting the basics right.’ A growing number of bank departments are concluding that they require real-time data to do their jobs more effectively, including those in treasury, front office, risk, and operations.


There’s no need to be satisfied with approximations and workarounds anymore. Realiti’s clever software can be implemented seamlessly, with minimal intrusion into a firm’s infrastructure. The move has completely changed how many firms see and use data, resulting in millions in cost reductions and revenue generation.
