
How businesses can differentiate in a generative AI world

Customer experience has been the battleground for brands over the last 5 years. But that's all about to change.

Video Transcript:

(00:00:23):

Most businesses in recent years have differentiated based on customer experience,

(00:00:28):

but generative AI is potentially going to change all of that entirely.

(00:00:33):

If you think about most businesses,

(00:00:35):

most businesses in most sectors are pretty much exactly the same.

(00:00:39):

If you're a bank, all banks do the same stuff.

(00:00:41):

Most car manufacturers are all the same.

(00:00:44):

A lot of the cars are built on the same chassis.

(00:00:46):

They use the same engines.

(00:00:47):

You can't differentiate on the product itself.

(00:00:49):

You can't differentiate on the brand itself anymore unless you've got a

(00:00:52):

particularly strong brand like Apple or Nike or something similar.

(00:00:54):

So it's experience, and the experience layer has really been where businesses have

(00:00:58):

been able to create some kind of differentiation.

(00:01:01):

I can't tell you how many banks I've moved away from because of a poor online experience.

(00:01:06):

But even with the generative AI technologies that exist today,

(00:01:10):

providing great experiences is still just as valuable as it was before.

(00:01:15):

There are so many terrible IVR systems.

(00:01:18):

There are so many terrible chatbots.

(00:01:21):

But if everybody is using the same models,

(00:01:25):

then does that mean that over time the experience of having these conversations

(00:01:29):

becomes homogenized and so similar that if you're talking to Direct Line,

(00:01:35):

you may as well be talking to Hastings because they're all using the same

(00:01:38):

fundamental models that are all doing the same kind of stuff.

(00:01:41):

The only difference is really that your answer is going to be slightly different

(00:01:44):

based on the policy.

(00:01:45):

There is a risk here that everybody using the same technology creates this homogeneous,

(00:01:51):

albeit working,

(00:01:53):

experience that then makes it very difficult to differentiate.

(00:01:58):

So the question is, how do you differentiate

(00:02:01):

in a generative AI era?

(00:02:03):

And one of the ways I think businesses can differentiate is through considering

(00:02:08):

creating their own fine-tuned models for specific capabilities.

(00:02:13):

Large language models are going to become more capable,

(00:02:16):

although potentially there's a plateau that we're going to see there.

(00:02:19):

But needless to say, large language models can do a lot of the tasks that you need them to do.

(00:02:23):

They can vectorize content so that you can use retrieval augmented generation.

(00:02:28):

They can create summaries of multiple documents to answer complex user questions.

(00:02:32):

They can classify intents,

(00:02:35):

for want of a better phrase,

(00:02:36):

to understand what people mean when they say whatever it is

(00:02:39):

that they say.

(00:02:40):

They can do things like entity extraction to take out important information from a

(00:02:45):

customer utterance.

(00:02:46):

They can do things that NLU systems couldn't do very well, such as processing long user inputs.

(00:02:51):

If someone writes you an essay or sends you an email,

(00:02:54):

traditional NLU systems have often struggled to understand what to do with those things.

(00:02:58):

So large language models are incredibly capable and can do a lot of this stuff very well.
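
As an illustrative sketch of a couple of those tasks in practice, the snippet below asks a hosted model to classify the intent of a long customer email and extract entities in a single call. This is a hedged example only: the vendor client, model name, intent labels and prompt wording are all assumptions, not anything prescribed in the video.

```python
# Illustrative sketch: one LLM call that classifies intent and extracts entities
# from a long customer message. Vendor, model and label set are assumptions.
import json
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

INTENTS = ["upgrade_contract", "billing_query", "cancel_service", "technical_fault", "other"]

def classify_and_extract(customer_message: str) -> dict:
    """Ask the model for an intent label plus any useful entities, as JSON."""
    prompt = (
        "You are a contact-centre triage assistant.\n"
        f"Classify the message into one of these intents: {INTENTS}.\n"
        "Also extract entities such as account numbers, product names and dates.\n"
        'Reply with JSON only, in the form {"intent": "...", "entities": {...}}.\n\n'
        f"Customer message:\n{customer_message}"
    )
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model choice
        response_format={"type": "json_object"},  # ask for JSON back
        messages=[{"role": "user", "content": prompt}],
    )
    return json.loads(response.choices[0].message.content)

# A long, rambling email of the kind traditional NLU intent models struggle with.
email = (
    "Hi, I've been with you for about 18 months on the 60GB plan and my phone is "
    "getting a bit tired, so I was wondering what my options are for upgrading, "
    "and also whether my bill changes if I do it before March."
)
print(classify_and_extract(email))
```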

(00:03:02):

The problem is that they don't do it very fast.

(00:03:06):

So if you wanted to have an experience that runs on the voice channel,

(00:03:08):

for example,

(00:03:09):

either a voice interface that runs in an app or something that you can call and

(00:03:12):

speak to,

(00:03:13):

the inference time of large language models tends to be a little bit too long.

(00:03:17):

And also the cost of the inference on a large language model is typically going to

(00:03:21):

be a lot higher than it would be if you were to use some other kind of model.

(00:03:26):

With a small language model,

(00:03:27):

for example,

(00:03:28):

a fine-tuned small language model,

(00:03:29):

you might find that you can get faster performance.

(00:03:31):

You might find that you can secure it a bit better and run it on your own servers

(00:03:35):

rather than using APIs and pinging it off to some unknown cloud somewhere.

(00:03:39):

You can fine-tune them a lot more easily,

(00:03:41):

which means that you can customize them for very specific use cases.
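
To make the small-model idea concrete, here is a minimal, hedged sketch of fine-tuning a compact open model as an in-house intent classifier you could run on your own hardware. It assumes the Hugging Face transformers and datasets libraries; the base model, label set and toy training examples are invented purely for illustration.

```python
# Minimal sketch: fine-tuning a small open model as an in-house intent classifier.
# DistilBERT, the labels and the toy examples are illustrative assumptions.
from datasets import Dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

LABELS = ["upgrade_contract", "billing_query", "cancel_service", "technical_fault"]

# In practice this would be thousands of labelled utterances from your own channels.
train_data = Dataset.from_dict({
    "text": [
        "how do I upgrade my contract",
        "my bill looks wrong this month",
        "I want to cancel my plan",
        "my broadband keeps dropping out",
    ],
    "label": [0, 1, 2, 3],
})

tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "distilbert-base-uncased", num_labels=len(LABELS)
)

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, padding="max_length", max_length=64)

tokenized = train_data.map(tokenize, batched=True)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="intent-slm", num_train_epochs=3,
                           per_device_train_batch_size=8),
    train_dataset=tokenized,
)
trainer.train()

# Inference is a single fast forward pass on infrastructure you control.
model.eval()
inputs = tokenizer("I'd like to renew my contract early", return_tensors="pt").to(model.device)
predicted = model(**inputs).logits.argmax(dim=-1).item()
print(LABELS[predicted])
```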

(00:03:45):

And if you think about customers and users that are going to talk to your business...

(00:03:51):

The fundamental first thing you've got to get right is you have to be able to understand them.

(00:03:56):

I think that where businesses are going to end up,

(00:03:58):

and this is the direction that you should be trying to move in,

(00:04:00):

is you need to have a channel strategy in place,

(00:04:03):

which means that regardless of where customers contact you,

(00:04:07):

regardless of what channel that is,

(00:04:08):

whether it's through social media,

(00:04:09):

whether it's sending you an email,

(00:04:10):

calling your contact center,

(00:04:12):

or using the live chat on your website,

(00:04:13):

or using your chatbot,

(00:04:15):

or in your mobile app,

(00:04:16):

wherever they go,

(00:04:18):

the first thing they're going to hit is going to be your AI assistant or AI agent,

(00:04:23):

if you want to give it that term.

(00:04:25):

That AI assistant's primary job is going to be understanding what it is

(00:04:29):

you're trying to get done and matching that to something that the business can help

(00:04:33):

you with.

(00:04:34):

Often, that's not going to be something that it can help you with directly.

(00:04:39):

If you send a bank a message on Instagram and you're asking it to transfer some

(00:04:43):

money or what have you,

(00:04:44):

it's not going to do that on Instagram messaging because it can't authenticate you

(00:04:48):

and that's just not going to happen.

(00:04:50):

So it might then suggest that actually for this, you may as well use the app.

(00:04:53):

Or if you call your bank and the bank knows that you're registered for mobile

(00:04:56):

banking and it can see that you're logged in to your mobile banking app,

(00:04:59):

it might just send you a push notification once it understands your intent,

(00:05:02):

so that you can just get that thing done in the banking app rather than taking time

(00:05:06):

away from the people who would otherwise answer the phone, and away from other

(00:05:10):

people who would need that service more.
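
A rough sketch of what that kind of triage could look like in code: the classified intent is matched against what each channel can actually complete, and the customer is either handled in place, deflected to the app, or escalated. The intents, channel rules and helper functions here are hypothetical, invented only to illustrate the routing idea.

```python
# Hypothetical sketch of intent-based channel triage. The intents, channel
# capabilities and helper functions are invented for illustration.
from dataclasses import dataclass

# Which intents each channel is actually allowed to complete (e.g. no money
# transfers over Instagram DMs, because the customer can't be authenticated there).
CHANNEL_CAPABILITIES = {
    "instagram": {"balance_info", "branch_hours"},
    "phone": {"balance_info", "transfer_money", "report_fraud"},
    "mobile_app": {"balance_info", "transfer_money", "upgrade_account"},
}

@dataclass
class Customer:
    id: str
    registered_for_mobile_banking: bool

def send_push_notification(customer_id: str, intent: str) -> None:
    # Stand-in for whatever push service the business actually uses.
    print(f"[push] {customer_id}: tap to complete '{intent}' in the app")

def route(customer: Customer, channel: str, intent: str) -> str:
    """Decide where this request should actually get resolved."""
    if intent in CHANNEL_CAPABILITIES.get(channel, set()):
        return f"handle '{intent}' directly in {channel}"
    if customer.registered_for_mobile_banking and intent in CHANNEL_CAPABILITIES["mobile_app"]:
        send_push_notification(customer.id, intent)
        return f"deflect to mobile app via push notification for '{intent}'"
    return f"escalate '{intent}' to a human agent"

print(route(Customer("c-123", True), "instagram", "transfer_money"))
# -> deflect to mobile app via push notification for 'transfer_money'
```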

(00:05:13):

So one of the ways in which a business can differentiate using AI is to do

(00:05:19):

something like that,

(00:05:20):

which is to customize certain models to be able to provide certain capabilities

(00:05:26):

better than you would find from an off-the-shelf vendor and better than your

(00:05:30):

competitors can.

(00:05:32):

Classification is just one example.

(00:05:34):

Triaging is another example.

(00:05:36):

Entity extraction is a marginal gain because ultimately there are going to be lots of

(00:05:40):

models that are going to be able to do that kind of thing.

(00:05:42):

But you can see what I'm getting at.

(00:05:44):

Where is it in your business that you believe you have either intellectual property

(00:05:49):

or the ability to create significant value through having your own fine-tuned model?

(00:05:55):

For me, that's the area of differentiation.

(00:05:58):

It's still at the experience layer,

(00:06:00):

but it's in being smarter and more intelligent than what you're likely to find from

(00:06:06):

all of the rest of the competitors that you've got.

(00:06:08):

Most of your competitors today are going to be using retrieval augmented generation.

(00:06:12):

That's going to be where their use of generative AI starts and stops.

(00:06:16):

And the problem with retrieval augmented generation in a business context is that

(00:06:20):

although it can answer questions,

(00:06:21):

it doesn't have an awareness of the conversation.

(00:06:24):

It doesn't understand whether this question that's come from a customer is a

(00:06:27):

sales-based question,

(00:06:29):

or a support-based question.

(00:06:31):

It doesn't understand what products and services sit underneath that request, really.

(00:06:34):

It's just looking for semantic similarity patterns in the content that you've got.

(00:06:39):

So it can't understand where the customer is in their journey.

(00:06:42):

It can't marry together the products and services that you have against that stage

(00:06:46):

of the journey.

(00:06:47):

And therefore, it can't actually be useful.
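
The retrieval step behind that kind of system can be sketched in a few lines, which is a good way to see the limitation: the query is matched to help content purely by embedding similarity, with nothing about who the customer is or where they are in their journey. The embedding model name and the toy help articles below are illustrative assumptions.

```python
# Minimal sketch of the retrieval step behind RAG: the query is matched to
# content purely by embedding similarity. Nothing here knows who the customer
# is or what stage of their journey they're at. Model name is illustrative.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")

help_articles = [
    "To upgrade your contract, pay off your current device plan and choose a new phone.",
    "You can view your bill in the app under Account > Billing.",
    "Report a lost or stolen phone by calling us immediately.",
]

query = "how do I go about upgrading my contract?"
scores = util.cos_sim(model.encode(query), model.encode(help_articles))[0]
best = scores.argmax().item()
print(help_articles[best])  # a correct answer, but one that ends with a full stop
```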

(00:06:49):

So in a telco example,

(00:06:51):

if I say something like,

(00:06:54):

how do I go about upgrading my account or upgrading my contract?

(00:06:59):

The retrieval augmented generation response to that is going to be,

(00:07:02):

oh,

(00:07:02):

simple,

(00:07:02):

to upgrade your contract,

(00:07:03):

all you need to do is go and pay off your other contract and then find a phone and

(00:07:06):

take out a new contract,

(00:07:07):

full stop.

(00:07:08):

And this is what I call a full stop problem,

(00:07:09):

which is that it doesn't understand that what I'm telling you there is that I'm

(00:07:13):

talking about renewing my contract.

(00:07:15):

That's a revenue-based conversation.

(00:07:17):

What you should be doing instead of ending that response with a full stop is you

(00:07:20):

should be saying,

(00:07:21):

have you found a phone that you like?

(00:07:22):

Do you want to log in and let me look up your account information?

(00:07:24):

Let's see how long you've got left on your contract.

(00:07:26):

Let's move the conversation forward.
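
One illustrative way to attack that full stop problem, sketched under assumptions rather than taken from any particular vendor, is to pass the retrieved passages to the model together with a view of the customer's journey stage and an explicit instruction to always end with a next step. The client, model name and journey labels below are assumptions for the example.

```python
# Hedged sketch: wrap retrieved content with journey context and an explicit
# instruction to move the conversation forward instead of ending with a full stop.
# Vendor client, model name and journey stages are illustrative assumptions.
from openai import OpenAI

client = OpenAI()

def journey_aware_answer(question: str, retrieved_passages: list[str],
                         journey_stage: str) -> str:
    system = (
        "You are a telco assistant. Answer using only the provided passages. "
        f"The customer appears to be at this journey stage: {journey_stage}. "
        "This is a revenue conversation, so never end with a bare statement: "
        "always finish with one concrete next step or question that moves the "
        "customer forward, such as offering to check their account or remaining term."
    )
    context = "\n\n".join(retrieved_passages)
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative
        messages=[
            {"role": "system", "content": system},
            {"role": "user", "content": f"Passages:\n{context}\n\nQuestion: {question}"},
        ],
    )
    return response.choices[0].message.content

print(journey_aware_answer(
    "How do I go about upgrading my contract?",
    ["To upgrade, pay off your current device plan and choose a new phone."],
    journey_stage="considering_renewal",
))
```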

(00:07:29):

And so these are the areas I think that businesses can differentiate on.

(00:07:32):

One,

(00:07:32):

you can understand customers better potentially by having your own fine-tuned

(00:07:37):

models that understand them far more effectively.

(00:07:40):

Second,

(00:07:41):

triaging users and customers to the most appropriate channel where they can get

(00:07:44):

their issues resolved.

(00:07:46):

Third,

(00:07:47):

in conversation,

(00:07:48):

being a lot more sensible and intelligent about having awareness of the customer

(00:07:51):

journey and the real needs they're trying to get met.

(00:07:54):

And then fourth,

(00:07:55):

crafting really compelling,

(00:07:57):

really natural,

(00:07:58):

engaging,

(00:07:59):

fluid,

(00:08:00):

easy to use,

(00:08:01):

effortless experiences that help customers get stuff done there and then in channel.

(00:08:06):

So it's an interesting time.

(00:08:08):

And it's a time where most businesses are scrambling just to get some generative AI

(00:08:13):

use cases off the ground.

(00:08:15):

That's where you're going to find the homogeneity.

(00:08:18):

What you really need to be doing is focusing more on the experience.
