How do banks adopt cloud?
Robert Palatnick, DTCC’s tech leader, explains how financial institutions are implementing cloud processing.
Large financial institutions have been slow to move data storage and computing to cloud vendors: they are liable for sensitive customer and proprietary data, and regulations governing the treatment and localization of that data vary across jurisdictions.
This is changing. In Asia the most recent example is Standard Chartered, which in August announced a five-year migration of much of its core banking processing to the cloud in partnership with Microsoft. Microsoft Azure will be its primary vendor within a multi-cloud approach. Other banks such as HSBC have already made similar moves.
This month, the Depository Trust and Clearing Corporation – the clearing and settlement agent for U.S. capital markets – released a white paper to help guide banks on a similar journey.
Robert Palatnick, managing director and global head of technology research and innovation at DTCC, says the industry has enthusiastically adopted cloud over the past two years, moving data and applications from on-premises servers to vendors such as Microsoft Azure, AWS and Google.
But this shift is not like flipping a switch, and DTCC has assembled some of its learnings.
“You don’t just step into a gym for the first time and gain a six-pack,” Palatnick said. Realizing the benefits of cloud computing, such as scale and resilience, is a process. “You need to put in the engineering time.”
Storing data in the cloud is very cheap. Running analytics on that data is not: that is when vendors typically start their meters. Banks are still figuring out which analytics make the most sense when they are paying by the query.
Grappling with such questions has led institutions to also appreciate the value of their on-prem networks and data centers. “Using the cloud also shows the value of a fit-for-purpose server,” Palatnick said.
DTCC itself will retain core transaction processing on-prem, because of the strict business-continuity regulations it faces. For example, should a server fail, DTCC is obliged to fully restore functionality for U.S. clearing and settlement systems within two hours, with no loss of data.
That’s a high bar, too high to entrust to vendors.
DTCC has also been piloting a variety of blockchain-based programs. Is there overlap?
Palatnick says yes, although blockchain services remain embryonic, so the overlap is still largely theoretical. The vision of blockchain is to provide superior resilience and security by distributing data and processing across nodes, allowing full data portability. Realized in full, it could render multi-cloud vendor strategies obsolete.
“There is resilience in the blockchain model in its fully aspirational realization,” Palatnick said. “But we are early in that journey.”
In its white paper, DTCC outlines best practices for financial institutions, covering regulatory obligations, foundational technology requirements, resilience factors, and cloud vendor obligations.
The firm also identified several areas it will continue to explore as it deepens its use of cloud tech, including:
- Applications developed using “cloud-ready” technologies and approaches;
- Automating and layering technologies to best leverage cloud-hosted apps;
- Securing cloud processing through alerts and monitoring;
- Establishing a principles-based, harmonized regulatory framework;
- Building an industry-wide lexicon of cloud terminology.