Swift, Ubin projects show DLT’s promise and pitfalls
Major blockchain initiatives show the technology can offer privacy and scalability, but banks are waiting to see which models prevail.
Blockchain projects by major financial sponsors notched successes in 2017 at the proof-of-concept level: Swift, the Monetary Authority of Singapore, and others have concluded that distributed-ledger technology (DLT) can protect privacy and dramatically cut the operational costs of payments. But as different versions emerge, so do their pros and cons.
There will be room for many DLT approaches, but for any of them to flourish, individual banks need to think about when to graft them onto legacy systems. It’s not clear that 2018 will be that year – if anything, these PoCs show that the tipping point when financial institutions decide to implement DLT solutions is a ways off.
But their progress will force banks to pay more attention, because at some point they’ll also have to decide when to connect to their utilities via blockchain.
“This makes blockchain real for banks,” a vice chairman at a global bank told DigFin when asked about completed PoCs for Project Ubin, the Monetary Authority of Singapore’s bid to decentralize interbank payments. “It shows blockchain can be permissioned. Banks will be forced to come up with entire business models for operating on this basis.”
But he says that Ubin’s challenge will be scale: not just from a technical point of view, but also in winning overall industry buy-in. That in turn depends on how quickly the industry arrives at standards for DLT, which remain a focus of Swift and DTCC.
“It will take about four years before we arrive at a standard,” said Jan Noppen, head of standards tools and methodology at Swift, which concluded two PoCs this year. “That timeframe coincides with the maturation of the technology.”
One DLT or many?
Financial utilities such as Swift and the Depository Trust & Clearing Corporation (DTCC), as well as government-led initiatives such as Singapore’s Project Ubin, are working to develop standards that allow blockchains to communicate, along with protocols to which vendors are likely to conform.
Only the decision by the Australian Securities Exchange (ASX) to replace its registry, clearing and settlement systems with a blockchain built by Digital Asset Holdings sees a major institution taking a stand on a DLT provider. One reason: the move doesn’t force exchange participants to engage with ASX’s distributed ledger. They can opt to continue business as usual so long as they switch to a new global ISO messaging standard. While ASX hopes to kickstart broader use of DLT, it’s unclear how many member brokers will make the jump when ASX goes live on its new platform in March.
The others are still either experimenting, as is the case with Swift and Project Ubin, or have decided to use multiple DLTs for specific use cases, as DTCC has done. These are all utilities or multilateral groups; for DLT to work, it’s not just a question of whether the technology can prove itself, but of how willing member banks will be to move their own systems into the decentralized world and engage with the infrastructure in a new way.
DTCC is using vendor Axoni for a full-throated replacement of its credit-derivatives warehouse, drawn in part by that technology’s emphasis on integration with legacy systems, while it is working with Digital Asset Holdings to reinvent part of its repo settlement systems.
Project Ubin, backed by 11 banks and four tech companies, has completed PoCs showing how to conduct multilateral netting while preserving privacy among multiple participants, with prototypes developed on Corda, Fabric and Quorum.
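Multilateral netting itself is simple to state: each bank’s gross obligations to every counterparty collapse into a single net position per participant (the hard part the PoCs tackle is doing this while keeping individual obligations private). A minimal sketch in Python – bank names and amounts here are illustrative, not drawn from Ubin’s prototypes:

```python
from collections import defaultdict

# Hypothetical gross interbank obligations: (payer, payee, amount).
obligations = [
    ("BankA", "BankB", 100),
    ("BankB", "BankA", 60),
    ("BankB", "BankC", 80),
    ("BankC", "BankA", 50),
]

def multilateral_net(obligations):
    """Collapse gross obligations into one net position per bank.

    Negative means a net payer; positive means a net receiver.
    """
    net = defaultdict(int)
    for payer, payee, amount in obligations:
        net[payer] -= amount
        net[payee] += amount
    return dict(net)

positions = multilateral_net(obligations)
print(positions)  # {'BankA': 10, 'BankB': -40, 'BankC': 30}
```

Across the whole system the net positions always sum to zero – which is also why a single failed participant can gridlock settlement for everyone else.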
Swift, along with 33 member banks, conducted a PoC to see if banks can reconcile nostro accounts in real time on Fabric, but it has also been analyzing other vendors to see if they might turn out to be a better choice.
Based on Project Ubin’s completed second phase and Swift’s experience, here are the pros and cons of three prototypes – the nuances of which could determine how much support they ultimately receive from banks.
Corda (R3)
Pros:
- easier to scale than the other prototypes
- integrating banks as nodes is straightforward
- privacy is built into the architecture, with banks permissioned on a need-to-know basis; transaction identities are not broadcast across the network
- speed and finality are assisted by the creation of notary nodes
Cons:
- transaction amounts are publicly known, so anonymity requires many players on the network
- Corda’s architecture leads to ever-extending lineage chains carrying long transaction histories – so it’s slow
- notary nodes represent single points of failure; when a notary fails, the transaction information is preserved but the transaction can’t be completed
Hyperledger Fabric (IBM)
Pros:
- speed is good: Fabric allows ‘channels’, bilateral exchanges that are private and have their own ledgers; these are used to transfer funds, reserving the blockchain for netting among many participants
- efficiency gains: Fabric deploys an ‘orderer’ node to validate transactions
Cons:
- channels are not interoperable, so they link to the blockchain through conventional software
- banks must maintain minimum funding levels per channel
- the regulator (MAS) is required as an overseer to step in with liquidity if one bank is temporarily unable to fund a liability – so there is a need for a centralized authority (although some might regard this as a ‘pro’)
- orderer nodes introduce the risk of single points of failure
- chaincode uses low-level programming languages, so it will need upgrading to meet future standards for DLT
Quorum (J.P. Morgan)
Pros:
- resiliency: Quorum broadcasts transaction information across the network while keeping detailed instructions private among participants, so banks can validate transactions without having to trust their counterparties; zero-knowledge proofs are the tool that shields banks’ private information while trades are validated
- individual node failures do not disrupt the generation of new blocks, and information is preserved when nodes come back online
Cons:
- zero-knowledge proofs are still slow and consume computing power
- a failed zero-knowledge proof disrupts netting, creating the risk that slow processing leads to nodes being switched off while the network is trying to resolve liquidity gridlock
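Real zero-knowledge proofs involve heavyweight cryptography, but the underlying idea – letting the network check a claim about data it cannot see – can be illustrated with a far simpler hash commitment. The sketch below is not how Quorum’s zero-knowledge layer actually works; it is only a toy analogy, and the function names are illustrative:

```python
import hashlib
import secrets

def commit(amount: int, nonce: bytes) -> str:
    """Publish a commitment to a transaction amount without revealing it."""
    return hashlib.sha256(nonce + str(amount).encode()).hexdigest()

def verify(commitment: str, amount: int, nonce: bytes) -> bool:
    """Check a later-revealed amount against the published commitment."""
    return commit(amount, nonce) == commitment

# A bank commits to an amount; the network stores only the hash.
nonce = secrets.token_bytes(16)
c = commit(250, nonce)

# When the amount is revealed, anyone can verify it matches the commitment.
assert verify(c, 250, nonce)
assert not verify(c, 300, nonce)  # a tampered amount fails verification
```

Unlike a true zero-knowledge proof, this scheme requires the amount to be revealed eventually for verification; the real constructions let a counterparty prove properties of a trade without ever disclosing the underlying figures – which is precisely where the speed and computing costs listed above come from.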