Photo by Hu Chen on Unsplash

Chatbot Development Frameworks & Message Abstraction

And How It Assists With Experience Management

Cobus Greyling
5 min read · May 27, 2022


Introduction

In Conversational AI design and development, different conversational elements need to be presented to the designer or developer in order to craft compelling conversations.

These elements are abstractions of human-to-human conversation. You can think of an API integration as an agent doing a lookup on a system to serve a customer better, or of initial intent detection as analogous to a reception desk, where the receptionist's first act is to understand the visitor's intention.

Traditionally, the elements abstracted and surfaced to the user are primarily NLU (intents & entities), dialog management and integration points. Integration points can be divided into back-end lookups and medium integration. Back-end systems can be CRM systems, billing platforms, ticketing, etc.

The medium, user interface or access channel is the second consideration for integration. This is where the bot is surfaced, be it Messenger, WhatsApp, SMS or other mediums.

The elements constituting Amazon Lex.

As seen above, different elements of the conversation are extracted and presented to the developer or designer. The user can then tweak these elements and subsequently orchestrate the conversation. It is glaringly obvious that dialog management and development are absent. Read more about that here and about possible remedies here.

Chatbot Scripts & User Messages

The script of a chatbot usually lives in spreadsheets and is managed from there by copying and pasting it into a design tool or dialog management tool.

The future does hold Natural Language Generation (NLG), where the script or bot responses are generated in real time when responding to the user. But a large amount of fine-tuning will be required for this to be successful, reliable and scalable.

Once messages become part of an application, they are usually embedded in the call-flow, in individual nodes, and need to be found and edited there, in the flow.

An example of messages embedded in the call-flow.

This is the common way of doing it. There might be an option to export the call-flow to JSON format, and from there the messages can be viewed and edited in bulk, but that is neither elegant nor practical.
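As a rough sketch of that route, assume a hypothetical export where the flow is a JSON list of nodes and each node that speaks to the user carries a message field; the file name and node structure below are assumptions, not any specific vendor's format. Bulk viewing of the messages could then look like this:

    import json

    # Load a hypothetical call-flow export; the structure is assumed,
    # not the format of any specific design tool.
    with open("call_flow_export.json", encoding="utf-8") as f:
        flow = json.load(f)

    # Collect every node that carries a bot message so the copy can be
    # reviewed in one place instead of node by node in the flow editor.
    messages = [(node["id"], node["message"])
                for node in flow.get("nodes", [])
                if "message" in node]

    for node_id, text in messages:
        print(f"{node_id}: {text}")

Any edits still have to be written back into the right nodes and the flow re-imported, which is exactly why this route is neither elegant nor practical.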

Abstracting Scripts & Messages

Nuance Mix

There are a few frameworks that have recognised this problem and subsequently introduced a message abstraction option. As seen below, in Nuance Mix's Design tab, there are options for:

  • NLU
  • Variables
  • Messages
  • Events, and
  • Data.
In Nuance Mix, under the Design tab, there are options for NLU, Variables, Messages, Events and Data.

When Messages is selected, the bot response messages are displayed, with options to enable or disable barge-in and to define text- or voice-related messages.

Barge-in is especially useful for voice applications, where it helps manage turn-taking and facilitates dialog management.

Voice applications also mandate shorter messages, more distilled and to the point. This approach by Nuance Mix allows copywriters to be integrated into the workflow and development environment.
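To make the abstraction concrete, each bot response in such a view can be pictured as a simple record holding a text variant, a shorter voice variant and a barge-in flag. The sketch below is purely illustrative and not Nuance Mix's actual data model:

    from dataclasses import dataclass

    @dataclass
    class BotMessage:
        # Illustrative sketch only; not Nuance Mix's actual data model.
        message_id: str
        text_variant: str       # fuller message for text channels
        voice_variant: str      # shorter, distilled message for voice
        barge_in: bool = True   # may the caller interrupt the prompt?

    welcome = BotMessage(
        message_id="WELCOME",
        text_variant="Welcome! I can help with orders, billing and delivery queries.",
        voice_variant="Welcome! How can I help?",
        barge_in=True,
    )
    print(welcome.voice_variant)

Listing all such records in one place is what allows a copywriter to work on the wording without opening the call-flow itself.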

Microsoft Bot Composer

Another example is Microsoft Bot Composer.

In the Composer documentation this is referred to as language generation (LG), not to be confused with Natural Language Generation (NLG).

LG in Composer approaches message abstraction as an expanded template with enhanced entity substitution. If entity substitution is leveraged and implemented extensively, bot messages are more descriptive, personalised and varied.

You can provide “one of” variation for expansion as well as conditionally expand a template.

Output from LG can be:

  • a simple text string,
  • a multi-line response, or
  • a complex object payload.

Here is an example of a simple language generation template with text values and variations. When multiple responses are defined in a template, a single response is selected at random.
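A sketch of what such templates can look like in an .lg file; the template names, the part_of_day condition and the user.name property are illustrative, not taken from the article's screenshot:

    > Simple template with "one of" variations; one line is picked at random.
    # GreetingPrefix
    - Hi
    - Hello
    - Good to see you

    > Entity substitution: user.name is pulled from memory and inserted into the response.
    # WelcomeUser
    - ${GreetingPrefix()}, ${user.name}!

    > Conditional expansion of a template.
    # TimeOfDayGreeting
    - IF: ${part_of_day == 'morning'}
        - Good morning, ${user.name}.
    - ELSE:
        - Good evening, ${user.name}.

WelcomeUser demonstrates both "one of" variation and entity substitution, while TimeOfDayGreeting shows conditional expansion of a template.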

To enable a dialog to see the templates contained in common.lg, select Show code in the upper right corner of the Bot Responses page to enable editing, then add [import](common.lg) to that dialog's template (see screenshot below).


Kore AI

Kore AI has, as part of its implementation, Standard Responses for Statements, Queries, Error & Warnings, Questions and Choices. These are default text responses abstracted in one place.


There is a section which lists the messages and explains when each response will be triggered.

Conclusion

Seeing a list of bot response messages and the conditions under which these responses are displayed helps immensely with getting an overview of the bot’s scope. Updating the responses for changes in products and services is easier, and the opportunity to embed entities as much as possible, making the messages as personalised and dynamically generated as possible, is immense.

Creating an additional abstraction option might seem like added complexity, but the process and way-of-work will surely be simplified.
