Opinion

As banks digitise, are they sacrificing human connection?

Bradley Elliott

In a world where technology is evolving at breakneck speed, the need for human empathy in banking remains unchanged. The writer says financial institutions must adapt to foster understanding and trust, proving that when it truly matters, the human touch is irreplaceable.


A chatbot just told a grieving widow to check the FAQ. Somewhere, a product team celebrated the deflection rate.

That sentence should be uncomfortable. It is also, for too many institutions, entirely plausible. The banking industry has spent the last decade in a race to digitise, to streamline, to automate, to remove friction from the customer journey. In many respects, it has succeeded. Account opening times have collapsed. Payments are instant. Mobile experiences that would have seemed extraordinary ten years ago are now table stakes.

But somewhere in the drive for efficiency, something got confused. Removing friction from a balance enquiry is not the same as removing a human from a mortgage conversation. Automating a payment confirmation is not the same as automating the response to a customer who hasn’t slept in three nights because their business is failing.

Banks have digitised the transactional. The problem is they’ve started digitising the relational too. And customers, in their most vulnerable, most consequential moments, are the ones absorbing the cost.

Digitisation became an end in itself

The misdiagnosis runs deep. “Digital transformation” arrived in boardrooms as a strategic imperative and was frequently translated into a single objective: move everything online and take humans out of the loop wherever possible.

This was dressed up as innovation. In most cases, it was cost reduction with better branding, aimed at protecting margins in an ever more competitive market.

The distinction that went unmade, and that the industry is only now beginning to reckon with, is between two entirely different kinds of customer interaction. There are transactional moments: checking a balance, transferring money, updating an address. These should be digital, fast, and frictionless. Customers do not want to call anyone for these. They never did.

Then there are relational moments: applying for a mortgage on a house that stretches every financial limit. Asking for support when a business faces collapse. Navigating bereavement, divorce, and redundancy, the financial consequences of life falling apart.

These are not service interactions; they are moments where the quality of human engagement determines not just customer satisfaction but customer outcomes. Yet they have been handed to the same systems designed to deflect a balance enquiry.

What has been quietly lost in this process is harder to quantify but not hard to name: the branch manager who knew which customers were likely to struggle before they asked for help. The relationship adviser who called before a missed payment became a crisis.

The banker who understood that behind the account number was a person making the most consequential financial decision of their life, and who treated it accordingly. These were not inefficiencies; they were part of the core value proposition of financial service businesses: trust.

The wrong question has been running the industry

Most banks are asking: what can we automate? The question that should be driving strategy is the inverse: what should we never automate?

The default model treats automation as the destination and human involvement as the exception, a concession to complexity or regulation. The right model inverts this entirely. It starts by mapping the emotional stakes of every customer interaction, identifying the moments where human judgment is not just preferable but irreplaceable, and building technology that serves those moments rather than circumventing them.

This is not an argument against AI. It is an argument for deploying it with more intelligence than the industry currently demonstrates. AI is genuinely exceptional at the transactional layer: processing applications at speed, personalising product recommendations at scale, detecting unusual account behaviour before a customer notices it themselves. These are real capabilities that create real value.

None of them requires a human.

But AI deployed at the relational layer, as the first and often only point of contact in moments of financial difficulty or significant life decision, is not a capability. It is an abdication. The distinction matters enormously, and conflating the two is costing the industry more than it realises in trust, in loyalty, and increasingly in regulatory standing.

AI should handle the transactional so humans can own the relational, not replace the relational entirely.

The emerging model, what some in the industry are beginning to call cognitive banking, points toward something more sophisticated. AI that reads context. Systems that detect distress signals in customer language or behaviour and route to a human before the customer has to ask three times.

Intelligence tools that give a relationship manager full customer history and context in seconds, so that when the human call happens, it is not a fresh start but a continuation. Empathy at scale is not a contradiction, but it requires designing AI to escalate intelligently, not to contain and deflect.

What the industry is telling us

The conversation at senior levels has shifted. The debate about whether to modernise is over. The question now being asked at the boardroom level is not whether to deploy AI, but whether it is delivering trust and measurable customer outcomes, or just operational efficiency metrics.

Institutions getting this right are not those with the most advanced technology. They are those who have been clearest about where technology serves the customer relationship and where it undermines it.

The FSCA's Treating Customers Fairly (TCF) framework has fundamentally changed the terms of the conversation, and banks that treat TCF as a tick-box exercise are misreading both the letter and intent of the standard. TCF Outcome 6 is explicit: customers must not face unreasonable barriers when they need support.

A chatbot loop that cannot escalate, or a digital journey that has no human exit, is not a defensible response to a customer flagging financial difficulty.

South Africa's model places conduct regulation at the centre of how banks are expected to operate. For a market where financial inclusion remains an urgent national priority, the stakes of getting this wrong extend beyond individual customers.

A first-time banking customer who cannot navigate a digital-only service does not just have a poor experience. They disengage from formal financial services entirely. The regulator is watching, and the standard being set is one of genuine care, not procedural compliance.

The evidence from customer behaviour reinforces what should already be obvious: customers who remember their bank most positively are almost never those who had the fastest app or the most seamless digital journey. They are the ones who received a phone call they didn’t expect. Who spoke to someone who had read their file before picking up the phone. Who felt, in a moment of financial stress, that the institution on the other side of the account understood what was at stake.

 

What needs to change

 

  • High-volume, low-consequence interactions should be fully automated. High-consequence interactions, regardless of how transactional they appear, require a human pathway that is fast, accessible, and not buried behind three layers of chatbot.

 

  • AI should detect and escalate, not contain and deflect. Distress signals exist in language, in call frequency, in unusual account behaviour, in the pattern of a customer contacting the bank repeatedly on the same issue. AI is capable of reading all of these signals. The question is whether institutions are deploying it to route customers toward care or away from cost.

 

  • The relationship manager with an AI tool that delivers full customer context, flags potential vulnerabilities, and surfaces relevant products before a conversation begins is not less human than their predecessor. They are more effective. Intelligence-augmented colleagues are not a concession to technology; they are the model that makes empathy at scale possible.

The banks that earn loyalty in the next decade won’t be the ones with the best app. They’ll be the ones that know when to put down the technology and pick up the phone.

The industry has demonstrated, convincingly, that it can build exceptional digital experiences. The harder challenge, the one that will separate the institutions that endure from those that are simply efficient, is building the wisdom to know where those experiences end and where genuine human care must begin.

Technology has advanced. People haven’t changed. They still want to feel that, when it matters, there is someone on the other side who understands what is at stake. That is not a legacy expectation. It is the irreducible standard of what it means to be trusted with someone’s financial life.

 

* Bradley Elliott is CEO of Anti-Money Laundering (AML) platform RelyComply.

** The views expressed do not necessarily reflect the views of IOL, Independent Media or the Independent on Saturday.