
Human, informed by Computer, Says Yes

Feb 7 2024 • 3 min read

In the comedy show Little Britain, there is a sketch involving a customer service representative sitting behind a computer. Whenever a customer makes a perfectly reasonable request, she taps away at her keyboard and says: 'computer says no'.

The sketch resonated with viewers because it illustrated what can happen when you fail to engage with and listen to a customer’s request, hiding instead behind the cues you take from a machine.

In a recent project undertaken with The Foundation, for example, we learned of a call from a distraught customer who couldn’t make her next mortgage payment. ‘Our system will only let me help you when you have already missed three payments, so call us back then’, came the reply from the Call Centre agent. It might as well have been ‘Computer Says No’.

As a result of these all-too-common experiences, we’ve been doing further work on how we can use behavioural science and the latest advances in generative AI to help create more empathetic conversations between human beings and the machines they interact with. And we’ve turned the focus of this Behavioural AI programme to call centres.

We started by developing a framework for measuring empathy in conversations between human beings. This was undertaken by CogCo behavioural scientists, who identified measurable behavioural constructs for empathy. They then turned these into a scoring mechanism, using insights from decision analysis, and validated the outputs through a series of tests with real-world data.
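As an illustration of what a scoring mechanism of this kind might look like: the post does not publish the actual constructs or weights, so the construct names and numbers below are hypothetical placeholders, but the basic shape is a weighted combination of per-construct ratings mapped onto a 0–100 scale.

```python
# Hypothetical constructs and weights, for illustration only.
CONSTRUCT_WEIGHTS = {
    "acknowledgement": 0.3,   # does the agent acknowledge the caller's situation?
    "active_listening": 0.4,  # does the agent reflect back what the caller said?
    "solution_focus": 0.3,    # does the agent move towards a concrete next step?
}

def empathy_score(construct_ratings: dict) -> float:
    """Combine per-construct ratings (each between 0 and 1) into a 0-100 score."""
    total = sum(
        CONSTRUCT_WEIGHTS[name] * construct_ratings.get(name, 0.0)
        for name in CONSTRUCT_WEIGHTS
    )
    return round(100 * total, 1)

# A call that is strong on listening but weak on moving to a next step:
print(empathy_score({"acknowledgement": 0.8,
                     "active_listening": 0.9,
                     "solution_focus": 0.4}))  # → 72.0
```

The weights here are the kind of thing the decision-analysis step would determine; the validation against real-world data is what keeps such a score honest.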

Using a foundation AI model, we then built a set of tools for recording, transcribing, cleaning and labelling calls between customers and call centre operators. This enabled our data scientists to analyse how well different calls scored on our empathy scale. This can be done on a previously unimaginable scale - taking the calls of thousands of call operators working over a period of months or even years.

The results from our early work are already striking, showing the patterns in calls which are more (or less) empathetic. In the calls below, for example, the high empathy call (scoring 90 on our Empathy Scale) is characterised by a balanced conversation between caller and agent. This can be contrasted with the low empathy call (scoring 25 on our Empathy Scale), in which one side (the caller) has a complex complaint that is not being fully addressed by the operator.

It is one thing to show what constitutes an empathetic call (or not). But it’s another to be able to generate constructive feedback that helps people to improve over time. We’re now able to do this too, at scale. In the example below, we can see feedback, generated using our Behavioural AI tools, addressed directly to the operators on specific high- and low-empathy calls.

These are just a couple of examples of the kind of analysis we can now perform using these techniques. Others include summarising the call intent and categorising it automatically; analysing the call dynamic (e.g. how long the agent and caller spoke for; how many turns were taken); and analysing the relationship between elements of a call and specific outcomes (such as an NPS score).
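The call-dynamic measures mentioned above - how long the agent and caller spoke for, and how many turns were taken - can be sketched from labelled turns. The speaker names and example turns below are illustrative, and word counts stand in for speaking time:

```python
from collections import Counter

def call_dynamics(turns: list) -> dict:
    """Summarise who spoke, how often, and for what share of the call.

    `turns` is a list of (speaker, text) tuples; word counts proxy for
    speaking time when only a transcript is available.
    """
    turn_counts = Counter(speaker for speaker, _ in turns)
    word_counts = Counter()
    for speaker, text in turns:
        word_counts[speaker] += len(text.split())
    total_words = sum(word_counts.values()) or 1
    share = {s: round(w / total_words, 2) for s, w in word_counts.items()}
    return {"turns": dict(turn_counts), "word_share": share}

example = [
    ("AGENT", "How can I help you today?"),
    ("CALLER", "My delivery never arrived and nobody has replied to my emails."),
    ("AGENT", "I'm sorry to hear that, let me look into it."),
]
print(call_dynamics(example))
```

A heavily one-sided word share is one simple signal of the unbalanced, low-empathy pattern described above; relating such measures to outcomes like an NPS score is then a standard regression exercise.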

Our plan now is to embed these insights into human-to-human training programmes and AI-generated feedback tools, so that, in the future, ‘human, informed by computer, says yes’.

If you would like to arrange a meeting to run through some of our findings in more detail, and talk about how you can apply this in your work, feel free to get in touch.

Read recommended blogs

Artificial Intelligence, Chatbots and Behavioural Insights

If you’ve ever tried to contact a bank with a query about an account, a utility provider over how to make a payment, or a retailer about a missing delivery, the chances are that you will have been forced to engage with a chatbot. And the likelihood is that, with a few exceptions, you will...

Jan 12 2024 • 4 min read

Automatic Enrolment: A New Application in Savings

One of the most cited applications of behavioural science in practice is the powerful effect of automatically enrolling people into schemes like pension plans. With a simple change in the sign-up process, these interventions show that we can move from a world in which the minority of people save for retirement to one in which...

Jan 8 2024 • 3 min read

New Year, Same Old Predictions

At the beginning of a New Year, financial journalists churn out articles about how the markets are likely to perform over the next 12 months. Their sources are often the forecasts published by industry analysts. According to one report published in December 2023, for example, analysts have on average predicted that the value of the...

Jan 3 2024 • 3 min read

Interested in working with us?

Get in touch at
