Artificial Intelligence, Chatbots and Behavioural Insights

Jan 12 2024 • 4 min read

If you’ve ever tried to contact a bank with a query about an account, a utility provider over how to make a payment, or a retailer about a missing delivery, the chances are that you will have been forced to engage with a chatbot. And the likelihood is that, with a few exceptions, you will have found the experience deeply frustrating.

At CogCo, we’ve been reviewing a wide range of chatbots and the tools upon which they are built as part of our ongoing Behavioural AI programme. This has included interacting with chatbots using a number of different scenarios (from the simple to the complex) in order to understand where they perform best, and where the frustrations start to kick in.

Our review so far shows that most existing chatbots, and the tools that power them, still suffer from two technical limitations. The first relates to problem-solving ability: chatbots can excel at routine tasks, but they often struggle with anything unexpected or novel. The second relates to language complexity: most existing chatbots still find it difficult to decipher the intent of a user who types a request in their own words.

The example below, which we have anonymised from a chatbot conversation with a major high street bank, illustrates both of these points. We set the chatbot the relatively straightforward task of helping a customer who has made a payment from their account by mistake. But the chatbot misunderstands. It interprets the ‘error’ as the customer having come across an ‘error message’, and rather than seeking clarification, it provides the answer it thinks the customer wanted. Following clarification from the customer, the chatbot does then understand the challenge the customer faces, but because the request is slightly out of the ordinary, it deals with it by passing it on to a ‘colleague’. But this colleague never turns up.
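To make that failure mode concrete, here is a minimal, purely illustrative sketch (not the bank’s actual system, nor any vendor’s API) of the behavioural fix: a chatbot that checks how confident it is about the user’s intent and asks a clarifying question rather than guessing. The intent names, keyword lists and threshold below are all hypothetical placeholders.

```python
# Illustrative sketch only: a chatbot that surfaces ambiguity instead
# of guessing. Intents, keywords and the threshold are hypothetical.

CLARIFY_THRESHOLD = 0.75  # hypothetical confidence cut-off

# Toy intent model: each intent scores on keyword overlap. A real system
# would use a trained classifier, but the design point is the same.
INTENT_KEYWORDS = {
    "payment_made_in_error": {"payment", "mistake", "wrong", "sent", "error"},
    "error_message_on_screen": {"error", "message", "screen", "code"},
}

def classify(utterance: str) -> tuple[str, float]:
    """Return the best-scoring intent and a crude confidence score."""
    words = set(utterance.lower().split())
    scores = {
        intent: len(words & keywords) / len(keywords)
        for intent, keywords in INTENT_KEYWORDS.items()
    }
    best = max(scores, key=scores.get)
    return best, scores[best]

def respond(utterance: str) -> str:
    intent, confidence = classify(utterance)
    if confidence < CLARIFY_THRESHOLD:
        # The behavioural fix: admit the ambiguity rather than answer
        # the question the bot merely assumes was asked.
        return ("Just to check: did you make a payment by mistake, "
                "or are you seeing an error message?")
    return f"Routing you to help with: {intent}"

# Ambiguous input triggers a clarifying question, not a guess.
print(respond("I made a payment in error"))
```

A production system would replace the keyword scoring with a proper intent model; the point of the sketch is simply that, below some confidence threshold, the cheapest and most respectful move is to ask.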

These technical limitations are now being addressed, at least in part, by the increasingly sophisticated large language models underpinning the latest developments in generative AI. These models are vastly better than their predecessors at interpreting text inputs and at generating a far wider range of responses. But a further step will be needed to ensure that more complex situations can actually be resolved in a way that is satisfactory for the customer.

Where the technology will take longer to develop is in relation to two behavioural factors that will be familiar to designers and behavioural scientists: emotional understanding and user experience. The vast majority of corporate chatbots are not - yet - capable of interpreting the emotional state of the humans they are interacting with. It is currently difficult for them to respond with empathy to users who are in a hot state (as they often will be) or who are facing genuinely difficult situations. And very few chatbots achieve the goal of enhancing the user experience by removing unnecessary barriers and making it as easy as possible for the customer to achieve their goal (as opposed to the organisation’s).

We see all of these technical and behavioural limitations at play in the example below. A chatbot initially fails to understand the underlying problem (that a customer has lost their job and will therefore find it hard to pay their bill). It eventually works out why this is a challenge for the utility company (a bill that cannot be paid), but does so in a way that lacks any empathy for the situation the customer is facing (they have lost their job, after all). The chatbot then deals with the problem by pushing the customer to a website they could have found and navigated to anyway, adding frustration and friction to the experience.

We are currently piloting a range of projects that look at the opportunities for incorporating behavioural insights into AI applications - in chatbots, and in a wide range of other situations in which large numbers of human beings engage with a company’s systems and processes. This includes work on diagnosing the emotional state of an individual, and using this to change the way chatbots interact with people, so that they respond with empathy and understanding. It also includes helping to develop solutions and prompts that are more in line with how we might approach other behavioural challenges - in particular, by removing frictions both from the interaction itself and from the solutions being suggested.
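As a hedged illustration of that first strand - and emphatically not our production approach - the sketch below uses a crude count of distress cues to decide whether a user is in a hot state, and changes the response accordingly. The lexicon, threshold and wording are hypothetical stand-ins for what a trained model and careful content design would do.

```python
# Illustrative sketch only: a crude emotional-state check used to
# change how a bot responds. Lexicon, threshold and wording are
# hypothetical placeholders, not a real diagnostic model.

DISTRESS_TERMS = {"lost", "job", "can't", "cannot", "struggling",
                  "worried", "desperate", "stressed"}
HOT_STATE_THRESHOLD = 2  # hypothetical: two or more distress cues

def distress_score(utterance: str) -> int:
    """Count distress cues in the user's message."""
    return len(DISTRESS_TERMS & set(utterance.lower().split()))

def respond(utterance: str) -> str:
    if distress_score(utterance) >= HOT_STATE_THRESHOLD:
        # Acknowledge the situation first, then remove friction by
        # offering a direct route rather than a generic web link.
        return ("I'm sorry to hear that - that sounds really difficult. "
                "I can set up a payment plan for you right here, "
                "or connect you to a person straight away.")
    return "Sure - how can I help with your bill?"

# A hot-state message gets empathy plus a low-friction next step.
print(respond("I lost my job and I'm struggling to pay my bill"))
```

The design choice the sketch is meant to surface: once a user is flagged as distressed, the bot should do two things at once - acknowledge the situation, and shorten the path to resolution rather than redirecting the customer elsewhere.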

If you are interested in investigating any of these challenges, we would be very happy to talk. You might even be able to participate in one of our pilot programmes.

Interested in working with us?

Get in touch at info@cogco.co
