Chatbots, Disambiguation & IBM Watson Assistant Actions

Ambiguity Should Be Clarified & Not Avoided

Cobus Greyling

--

Introduction

Disambiguation became available on all Watson Assistant plan types on 26 November 2019. By now we all know the prime aim of a chatbot is to act as a conversational interface, simulating the conversations we have as humans.

Unfortunately you will find that many of the basic elements of human conversation are missing from most chatbots.

This demo shows how the Action is tested with ambiguous user input and the relevant options are presented back to the user to choose from.

Throughout a conversation we as humans will invariably and intuitively detect ambiguity.

Ambiguity is when something we hear is open to more than one interpretation.

Instead of just going off on a tangent not intended by the utterance, we perform the act of disambiguation by asking a follow-up question. Simply put, this is removing ambiguity from a statement or dialog in order to respond accurately.

Ambiguity makes sentences confusing. For example, “I saw my friend John with binoculars”. Does this mean John was carrying a pair of binoculars? Or that I could only see John by using a pair of binoculars?

Hence I need to perform disambiguation and ask for clarification. A chatbot encounters the same issue: when the user’s utterance is ambiguous, instead of going off on one assumed intent, the chatbot can ask the user to clarify their input. It can present a few options based on the current context, from which the user selects and confirms the most appropriate one.

Just to illustrate how effective we as humans are at disambiguating and detecting subtle nuances, have a look at the following two sentences:

  • A drop of water on my mobile phone.
  • I drop my mobile phone in the water.

These two sentences have vastly different meanings, and to us there is no real ambiguity between them, but a conversational interface will find them hard to detect and separate.

Your Chatbot Must Be Enabled For Disambiguation

Instead of defaulting to the intent with the highest confidence, the chatbot should check the confidence scores of the top 3 to 5 matches.
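As a rough sketch of how those scores can be retrieved, the snippet below uses the ibm-watson Python SDK to send one utterance to a Watson Assistant v2 instance and print the detected intents ranked by confidence. The API key, service URL and assistant ID are placeholders, and the `alternate_intents` option is used on the assumption that you want more than just the single top intent returned.

```python
# Minimal sketch: retrieve the ranked intents (with confidence scores)
# for one user utterance from Watson Assistant (v2 API).
# The credentials, service URL and assistant ID are placeholders.
from ibm_watson import AssistantV2
from ibm_cloud_sdk_core.authenticators import IAMAuthenticator

authenticator = IAMAuthenticator("YOUR_API_KEY")
assistant = AssistantV2(version="2021-06-14", authenticator=authenticator)
assistant.set_service_url("https://api.us-south.assistant.watson.cloud.ibm.com")

session = assistant.create_session(assistant_id="YOUR_ASSISTANT_ID").get_result()

response = assistant.message(
    assistant_id="YOUR_ASSISTANT_ID",
    session_id=session["session_id"],
    input={
        "message_type": "text",
        "text": "I drop my mobile phone in the water",
        # Ask for all matching intents, not only the top one.
        "options": {"alternate_intents": True},
    },
).get_result()

# Each intent carries a confidence score between 0 and 1.
for intent in response["output"]["intents"]:
    print(f"{intent['intent']}: {intent['confidence']:.2f}")
```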

When the actions within an Action Skill are listed, Disambiguation is one of the columns. This feature can be toggled on and off for each action, which can be helpful for actions where you require explicit input from the user.

If these scores are close to each other, your chatbot is in effect of the opinion that no single intent will address the query, and a selection must be made from a few options.

Here disambiguation allows the chatbot to request clarification from the user. A list of related options is presented to the user, allowing the user to disambiguate the dialog by selecting an option from the list. Ideally no more than three options are presented.
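The closeness check itself can be kept simple. The sketch below is not Watson Assistant’s internal algorithm, only an illustration of the idea, with arbitrary threshold values, and it caps the option list at three.

```python
# Illustrative only: decide whether to disambiguate, given ranked intents.
# The thresholds are arbitrary examples, not Watson Assistant's internals.
MIN_CONFIDENCE = 0.2   # ignore anything below this score
CLOSENESS = 0.85       # runner-up is "close" if >= 85% of the top score
MAX_OPTIONS = 3        # present at most three choices to the user

def disambiguation_options(intents):
    """Return a short list of intent names to offer the user,
    or an empty list if the top intent is a clear winner."""
    ranked = sorted(intents, key=lambda i: i["confidence"], reverse=True)
    if not ranked or ranked[0]["confidence"] < MIN_CONFIDENCE:
        return []  # nothing usable; fall back to a generic reprompt

    top = ranked[0]["confidence"]
    close = [i["intent"] for i in ranked if i["confidence"] >= top * CLOSENESS]
    # Only disambiguate if more than one intent is in contention.
    return close[:MAX_OPTIONS] if len(close) > 1 else []

intents = [
    {"intent": "phone_water_damage", "confidence": 0.52},
    {"intent": "screen_moisture", "confidence": 0.49},
    {"intent": "store_locator", "confidence": 0.11},
]
print(disambiguation_options(intents))  # ['phone_water_damage', 'screen_moisture']
```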

The list presented should be relevant to the context of the utterance; only contextual options should be offered.

Disambiguation enables chatbots to request help from the user when more than one dialog node might apply to the user’s query.

Instead of assigning the best-guess intent to the user’s input, the chatbot can compile a collection of the top matching nodes and present them. In this case the decision, when there is ambiguity, is deferred to the user.

Within Action settings, Disambiguation can be set globally.

What is really a win-win situation is when the feedback from the user is used to improve your NLU model, as this is invaluable training data vetted by the user.

Disambiguation can be triggered when the confidence scores of the runner-up intents detected in the user input are close in value to that of the top intent.

Hence there is no clear separation and certainty.

Disambiguation In Actions

Disambiguation occurs when your assistant finds that more than one action can fulfill a customer’s request, and asks the customer for clarification.

Instead of guessing which action to take, your assistant shows a list of the possible actions to the customer, and asks the customer to pick the right one.

This is how to toggle Disambiguation on and off for each Action within an Action Skill. You can see the status change from On to Off.

When the user is presented with disambiguation, the name of each disambiguation button is derived from the name of the corresponding action.

In cases where the action name is not specified, the first example utterance added to the action is used as the action name.

In this Actions view, the link between the Action Name and the Disambiguation option name is illustrated.
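A hypothetical sketch of that labelling rule is shown below; the `name` and `examples` fields are illustrative stand-ins, not the exact schema of a skill export.

```python
# Hypothetical illustration of the labelling rule described above;
# "name" and "examples" are illustrative fields, not the exact export schema.
def disambiguation_label(action: dict) -> str:
    """Use the action name if present, otherwise its first example utterance."""
    if action.get("name"):
        return action["name"]
    examples = action.get("examples", [])
    return examples[0] if examples else "Unnamed action"

print(disambiguation_label({"name": "Report water damage", "examples": []}))
print(disambiguation_label({"name": "", "examples": ["My phone fell in water"]}))
```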

Within the skill settings, the menu introduction line can be set and the fallback choice label.
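For reference, these two prompts map to skill-level disambiguation settings along the lines of the snippet below. The key names follow the classic dialog-skill export format (`system_settings.disambiguation`); an Actions skill may store them differently, so treat the exact names as an assumption.

```python
# Assumed shape, based on the classic dialog-skill export format
# (system_settings.disambiguation); an Actions skill export may differ.
skill_settings = {
    "system_settings": {
        "disambiguation": {
            "enabled": True,
            "prompt": "Did you mean:",                        # menu introduction line
            "none_of_the_above_prompt": "None of the above",  # fallback choice label
            "max_suggestions": 3,
        }
    }
}
```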

This fallback option gives users an avenue to break out of the disambiguation menu if it is not fulfilling their intent. If disambiguation is not successful, the following can be tried:

  • Check the training phrases of each Action and ensure there is no overlap (a simple check is sketched after this list).
  • Try to improve the Action Names to be more descriptive and definitive. This will, as shown earlier, be presented as the button names and might clarify the options to the user.
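As a rough illustration of the first point, the snippet below flags example utterances that are duplicated, after light normalisation, across two actions. The action names and phrases are invented for the example; a real check could also look at near-duplicates rather than exact matches.

```python
# Illustrative check for overlapping training phrases across actions.
# The action names and example utterances below are made up.
from itertools import combinations

def normalise(phrase: str) -> str:
    return " ".join(phrase.lower().split())

actions = {
    "Report water damage": ["my phone fell in the water", "phone dropped in water"],
    "Clean my screen": ["a drop of water on my phone", "my phone fell in the water"],
}

for (name_a, a), (name_b, b) in combinations(actions.items(), 2):
    overlap = {normalise(p) for p in a} & {normalise(p) for p in b}
    if overlap:
        print(f"Overlap between '{name_a}' and '{name_b}': {sorted(overlap)}")
```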

Conclusion

In conclusion, suffice to say that the holy grail of chatbots is to mimic and align with natural, human-to-human conversation as much as possible. Yet when designing the conversational flow for a chatbot, we often forget which elements are part and parcel of truly human-like conversation.

Digression is a big part of human conversation, along with disambiguation of course. Disambiguation mitigates to some extent the danger of fallback proliferation, where the dialog is not really taken forward.

With disambiguation, a collection of truly related, contextual options is presented to the user to choose from, which is sure to advance the conversation.

And finally, probably the worst thing you can do is present a set of options unrelated to the current context, or a predefined, finite set of options that recurs continually.

Contextual awareness is key in all elements of a chatbot.
