
Artificial Intelligence, Chatbots and Behavioural Insights

Jan 12 2024 • 4 min read

If you’ve ever tried to contact a bank with a query about an account, a utility provider over how to make a payment, or a retailer about a missing delivery, the chances are that you will have been forced to engage with a chatbot. And the likelihood is that, with a few exceptions, you will have found the experience deeply frustrating.

At CogCo, we’ve been reviewing a wide range of chatbots and the tools upon which they are built as part of our ongoing Behavioural AI programme. This has included interacting with chatbots using a number of different scenarios (from the simple to the complex) in order to understand where they perform best, and where the frustrations start to kick in.

Our review so far shows that most existing chatbots, and the tools that power them, still suffer from two technical limitations. The first relates to problem-solving ability: chatbots can excel at routine tasks, but they often struggle with anything unexpected or novel. The second relates to language complexity: most existing chatbots still find it difficult to decipher a user’s intent from a free-text written request.

The example below, which we have anonymised from a chatbot conversation with a major high street bank, illustrates both of these points. We set the chatbot the relatively straightforward task of helping with a situation in which a customer has mistakenly made a payment from their account. But the chatbot misunderstands: it interprets the ‘error’ as the customer having encountered an ‘error message’, and rather than seeking clarification, it provides the answer it thinks the customer wanted. Following clarification from the customer, the chatbot does then understand the challenge the customer faces, but because the request is slightly out of the ordinary, it deals with it by passing it on to a ‘colleague’. And this colleague never turns up.
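To make that failure mode concrete, here is a minimal sketch of the kind of keyword-driven intent matching that underpins many rule-based chatbots. Everything in it - the intents, keywords and threshold - is invented for illustration, and is not taken from the bank’s actual system:

```python
# Hypothetical illustration of a keyword-based intent matcher of the kind
# that underpins many rule-based chatbots. All intents, keywords and
# thresholds are invented for this sketch.

INTENT_KEYWORDS = {
    "error_message_help": {"error", "message", "code"},
    "payment_made_in_error": {"payment", "mistake", "wrong", "refund"},
    "missing_delivery": {"delivery", "missing", "parcel"},
}

CONFIDENCE_THRESHOLD = 0.5  # below this, ask rather than guess


def classify(utterance: str) -> tuple[str, float]:
    """Score each intent by keyword overlap and return the best guess."""
    words = set(utterance.lower().split())
    scores = {
        intent: len(words & keywords) / len(keywords)
        for intent, keywords in INTENT_KEYWORDS.items()
    }
    best = max(scores, key=scores.get)
    return best, scores[best]


def respond(utterance: str) -> str:
    intent, confidence = classify(utterance)
    if confidence < CONFIDENCE_THRESHOLD:
        # The behaviourally sensible move: seek clarification instead of
        # answering the question the bot *thinks* was asked.
        return ("Sorry - do you mean you saw an error message, "
                "or that a payment was made by mistake?")
    return f"[canned answer for intent: {intent}]"


# 'I made a payment in error' shares the word 'error' with the
# error-message intent, which a naive matcher scores highest; the
# confidence gate is what stops it answering the wrong question.
print(respond("I made a payment in error"))
```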

These technical limitations are now being addressed, at least in part, by the increasingly sophisticated large language models underpinning the latest developments in generative AI. These models are vastly better than their predecessors at interpreting text inputs and at generating a sophisticated range of responses. But it will take a further step to ensure that the more complex situations can actually be resolved in a way that is satisfactory for the customer.
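As an indicative sketch of what that further step might look like - assuming the OpenAI Python client, though any chat-capable model would do, and with a system prompt that is purely our own illustration - the model can be instructed to ask rather than guess, and to hand over explicitly:

```python
# Sketch assuming the OpenAI Python client (openai>=1.0); any chat-capable
# LLM API could be substituted. The system prompt is our own illustration,
# not a production policy.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

SYSTEM_PROMPT = (
    "You are a bank's customer support assistant. If the customer's request "
    "is ambiguous, do NOT guess: ask one short clarifying question. If the "
    "request is outside the tasks you can complete yourself, say that you "
    "are handing the conversation to a human colleague rather than "
    "attempting an answer."
)


def reply(user_message: str) -> str:
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model name; substitute as needed
        messages=[
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": user_message},
        ],
    )
    return response.choices[0].message.content


print(reply("I made a payment in error"))
```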

Where the technology will take longer to develop is in relation to two behavioural factors that will be familiar to designers and behavioural scientists: emotional understanding and user experience. The vast majority of corporate chatbots are not - yet - capable of interpreting the emotional state of the humans they are interacting with. It is currently difficult for them to respond with empathy to users who are in a hot state (as they will often be) or who are facing genuinely difficult situations. And very few chatbots achieve the goal of enhancing the user experience by removing unnecessary barriers and making it as easy as possible for the customer to achieve their goal (as opposed to the organisation’s).

We see all of these technical and behavioural limitations at play in the example below. A chatbot initially fails to understand the underlying problem (that a customer has lost their job and will therefore find it hard to pay their bill). It eventually works out why this is a challenge for the utility company (an unpaid bill), but does so in a way that lacks any empathy for the situation the customer is facing (they have, after all, just lost their job). The chatbot then deals with the problem by pushing the customer to a website which they could have found and navigated to anyway, adding friction and frustration to the experience.

We are currently piloting a range of projects that look at the opportunities for incorporating behavioural insights into AI applications - covering chatbots, and a wide range of other situations in which large numbers of human beings engage with a company’s systems and processes. This includes work on diagnosing the emotional state of an individual, and using this to change the way that chatbots interact with human beings, with empathy and understanding. It also includes helping to develop solutions and prompts that are more in line with how we might seek to solve other behavioural challenges - in particular, by removing frictions both from interactions and from the solutions being suggested.
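As a purely illustrative sketch of the first of these ideas - the distress cues and tone templates below are assumptions for the example, not a description of our pilots - an emotion check can sit in front of the response generator and change how the same underlying answer is framed:

```python
# Illustrative sketch only: the distress cues and tone templates are
# assumptions for this example, not a description of CogCo's pilots.

DISTRESS_CUES = ("lost my job", "can't pay", "cannot pay",
                 "struggling", "worried")


def looks_distressed(message: str) -> bool:
    """Crude lexical check for a 'hot' emotional state. A real system would
    use a trained classifier; the routing logic is the point here."""
    text = message.lower()
    return any(cue in text for cue in DISTRESS_CUES)


def frame_answer(message: str, answer: str) -> str:
    if looks_distressed(message):
        # Acknowledge the situation before the practical content, and
        # remove friction by offering to act rather than linking out.
        return ("I'm really sorry to hear that - that sounds stressful. "
                + answer + " I can set that up for you now if you'd like.")
    return answer


msg = "I've lost my job and I won't be able to pay my bill this month"
print(frame_answer(msg, "We can move you onto a payment plan."))
```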

If you are interested in investigating any of these challenges, we would be very happy to talk. You might even be able to participate in one of our pilot programmes.


Interested in working with us?

Get in touch at info@cogco.co
