Technology heads at financial institutions are looking at how ChatGPT and other large language models (LLMs) can deliver productivity gains – and at the hurdles they need to clear before they can deploy such software.
LLMs seized the attention of banks, along with the rest of the world, when US software company OpenAI released ChatGPT in November 2022. The power of this form of artificial intelligence is intuitive, and ChatGPT boasts more than 100 million users (only 12 percent of whom are from the US, according to demandsage.com).
Google and others have since released their own LLMs, and Microsoft (a major investor in OpenAI) is licensing GPT plug-ins to enterprises via its Azure cloud business.
But how can financial institutions actually use this technology?
Many banks and corporations have forbidden their employees to use it, for fear they might release proprietary or customer information into the public domain – once data is entered into ChatGPT’s online platform, it leaves the organization’s control.
Banks are also wary of LLMs’ tendency to “hallucinate”, i.e., invent answers and present them as facts. That makes them dangerous to put in front of customers or regulators, or to rely on for critical decisions.
Last week, three technology officials shared their views of ChatGPT, speaking at an event in Hong Kong hosted by GienTech, a Chinese tech vendor to financial institutions.
Their approaches vary, depending on their business needs and where they stand in their own digitalization.
Livi Bank
Livi Bank is one of Hong Kong’s licensed virtual banks. Its CTO, Gary Lam, noted that it doesn’t need to undergo digital transformation: it was born digital, with a cloud-based tech stack. It relies on tactics borrowed from e-commerce companies to acquire customers, including online advertising and promotions.
On the one hand, livi is already steeped in the uses of artificial intelligence: it relies on AI for aspects of customer onboarding, such as facial recognition and fraud detection. On the other hand, Lam says that, as a virtual institution, livi is even more sensitive to cybersecurity risks.
Therefore generative AI requires at least the same degree of risk management and care.
“Gen AI is a piece of software. I’d apply the same standard data-loss protections as other modules in the stack. We may need additional filters, however, before we’d release ChatGPT messages to our customers.”
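The kind of additional filter Lam alludes to could start as a simple outbound check before any generated message reaches a customer. This is a hedged sketch only – the patterns, blocklist, and function names below are invented for illustration, not livi Bank’s actual controls:

```python
import re

# Hypothetical outbound filter for LLM-generated customer messages.
# Patterns and phrases are illustrative assumptions, not real bank rules.
PII_PATTERNS = [
    re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),      # ID-number-like strings
    re.compile(r"\b\d{13,19}\b"),              # card-number-like digit runs
    re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),    # email addresses
]

BLOCKED_PHRASES = ["guaranteed return", "risk-free"]  # compliance blocklist

def release_to_customer(draft: str) -> tuple[bool, str]:
    """Return (approved, message). Hold back drafts that contain
    PII-like strings or wording a regulated bank cannot send."""
    for pat in PII_PATTERNS:
        if pat.search(draft):
            return False, "Held for review: possible personal data."
    lowered = draft.lower()
    for phrase in BLOCKED_PHRASES:
        if phrase in lowered:
            return False, "Held for review: non-compliant wording."
    return True, draft
```

In practice such a filter would sit alongside the standard data-loss protections Lam mentions, as one more module in the stack.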
Customer-facing use is under exploration, because LLMs can turbocharge productivity in client communications and servicing. The same goes for internal users, who Lam says may include coders, relationship managers, and risk managers.
The biggest internal use case is using human-language queries to search vast troves of regulatory documents. “We can have a human-like search engine to go through a large amount of material,” he said.
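As a rough illustration of such human-language search, a minimal sketch can rank passages by word-overlap cosine similarity – a production system would use LLM embeddings instead, and the regulatory snippets below are invented examples:

```python
import math
from collections import Counter

# Minimal sketch of "human-like" search over documents: rank passages
# by cosine similarity of word counts. Documents are invented examples.
def vectorize(text: str) -> Counter:
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def search(query: str, passages: list[str]) -> str:
    """Return the passage most similar to the natural-language query."""
    qv = vectorize(query)
    return max(passages, key=lambda p: cosine(qv, vectorize(p)))

passages = [
    "Banks must report suspicious transactions within three days.",
    "Capital adequacy ratios are reviewed quarterly.",
]
print(search("when do we report suspicious transactions", passages))
```

Swapping the word-count vectors for embeddings from an LLM is what turns this into the “human-like search engine” Lam describes.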
WeBank
Tencent-backed WeBank is one of the world’s most sophisticated digital banks, with 360 million retail customers after just eight years of operating in mainland China. Its proprietary technology lets it profitably serve customers whose average revenues are too low for a traditional bank to handle. WeBank is the poster child for rapid innovation at scale in consumer banking.
LLMs represent a real change, said Yao Huiya, Shenzhen-based head of fintech innovation. But WeBank is not rushing out a GPT service to interact with customers: that would be too risky, especially for a regulated institution. “By its nature you can’t avoid it talking stupid stuff,” he said.
WeBank is unlikely to use LLMs based on the public internet, given the risks of exposing data and breaching regulations. But it is fine-tuning smaller models that access only the bank’s own data.
Yao says LLMs may be deployed to improve the productivity of the bank’s customer onboarding and SME lending processes. A model could suggest good times to contact customers about a loan, help customize an introductory message, and improve the lending book’s performance by helping credit officers analyze company data.
Yao doubts LLMs will replace credit teams. “It will put the human in the loop, so they can ask the generative AI questions to make better decisions.”
The impact will be felt in the bank’s tech infrastructure. “The computing power will shift from CPUs to GPUs,” he said, referring to types of processors. “Our architecture will need plugins so we can deploy multiple models and run A/B tests on them.”
This is going to be true for all enterprises, not just digital banks. “This will change the architecture of the entire world,” Yao said. “The impact of LLMs won’t end.”
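The multi-model, A/B-tested setup Yao describes could be sketched as a router that deterministically assigns each user to one model arm – the model names and traffic split here are assumptions for illustration, not WeBank’s actual architecture:

```python
import hashlib

# Hypothetical router for A/B testing several LLM backends.
# Arm names and traffic shares are invented for the sketch.
class ModelRouter:
    def __init__(self):
        self.arms: list[tuple[str, float]] = []  # (model_name, traffic_share)

    def register(self, name: str, share: float) -> None:
        self.arms.append((name, share))

    def assign(self, user_id: str) -> str:
        """Hash the user id into [0, 1) and pick the arm whose cumulative
        traffic share covers that point, so assignment is sticky."""
        h = int(hashlib.sha256(user_id.encode()).hexdigest(), 16)
        point = (h % 10_000) / 10_000
        cum = 0.0
        for name, share in self.arms:
            cum += share
            if point < cum:
                return name
        return self.arms[-1][0]

router = ModelRouter()
router.register("model-a", 0.5)
router.register("model-b", 0.5)
```

Hashing the user ID rather than randomizing per request keeps each customer on the same model throughout a test, which is what makes the comparison between arms meaningful.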
Hong Kong Jockey Club
The Hong Kong Jockey Club is not a licensed financial institution but it engages in many finance-like activities. It has a monopoly in Hong Kong on horse racing and football betting. Like other organizations, it is going digital, such as by using data for its wagering systems – and like other incumbents, it has its own legacy issues to contend with.
Li Sai-Chin, executive manager for data and analytics solutions, says ChatGPT is forcing the Jockey Club to scramble. “It’s created a step change in thinking about data and analytics.” It’s a useful wake-up call to many executives about the need to embrace digital.
Betting on horses involves a lot of data: people look at information such as spreads and horse and jockey track records before they place their bets. “We expect them to ask more questions,” Li said, which makes something like ChatGPT potentially relevant.
This is a way to help the Jockey Club engage more regularly with its customers. For example, during the season the club organizes two sessions of horse races each week. On the other days, there is no interaction with bettors. Savvy chatbots could let customers ask questions and interact more regularly.
The first step is to encourage people to interact with the club’s data more regularly. Gradually, Li sees the club using sensors within its grounds to provide a real-time query and data experience to users as they wander around, checking out the horses, grabbing a beer, or placing a bet.
“We’re doing a lot of work thinking about the offline-to-online experience,” he said. “If we’re interacting with them in real time as they walk past one area, can we point out the next horse they’d like to see?”