
The Emergence Of Entity Structures In Chatbots

And Why It Is Important For Capturing Unstructured Data Accurately & Efficiently

Cobus Greyling
Aug 20, 2020


Introduction

Looking at the chatbot development tools and environments currently available, three ailments require remedy:

  • Compound Contextual Entities
  • Entity Decomposition
  • Deprecation of Rigid State-Machine Dialog Management

In this story I am going to focus only on entities, and on how they are managed in three Conversational AI environments: Rasa, Microsoft LUIS and Amazon Alexa Conversations.

Entities 101

What is an Entity?

The three words used most often in relation to chatbots are:

  • Utterances,
  • Intents and
  • Entities.

An utterance is really anything the user will say. The utterance can be a sentence, in some instances a few sentences, or just a word or a phrase. During the design phase you anticipate what your users might say to your bot.

An Intent is the user’s intention with their utterance, or engagement with your bot. Think of intents as verbs, or working words. An utterance or single dialog from a user needs to be distilled into an intent.

Microsoft LUIS — New Machine Learned Entities — Decomposition Of Entities

Entities can be seen as nouns; often they are referred to as slots. These are usually things like date, time, cities, names, brands etc. Capturing these entities is crucial for taking action based on the user’s intent.

Think of a travel bot: capturing the cities of departure and destination, travel mode, price, dates and times is the foundation of the interface. Yet this is the hardest part of the conversational interface. Keep in mind that users enter data randomly and in no particular order.

Compound and contextual entities are being implemented by more chatbot platforms. The option to contextually annotate entities is also on the rise. Examples here are Rasa, IBM Watson Assistant and, to a lesser degree, Amazon Lex.

Compound & Contextual Entities

Huge strides have been made in this area and many chatbot ecosystems accommodate these.

Contextual Entities

The process of annotating user utterances is a way of identifying entities by their context within a sentence.

Contextual Entity Annotation In IBM Watson Assistant

Often entities have a finite set of values which can be defined in a list. Then there are entities which cannot be represented by a finite list, like the cities of the world, names or addresses. These entity types have too many variations to be listed individually.

For these entities, you must use annotations; entities defined by their contextual use. The entities are defined and detected via their context within the user utterance.

Compound Entities

The basic premise is that users will utter multiple entities in one sentence.

Users will most probably express multiple entities within one utterance; these are referred to as compound entities.

In the example below, there are four entities defined:

  • travel_mode
  • from_city
  • to_city
  • date_time

Rasa: Extract of NLU.md File In Rasa Project
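
As a sketch, training examples in the nlu.md file annotating all four entities might look like this (the intent name and example sentences are illustrative):

    ## intent:book_trip
    - I want to go by [train](travel_mode) from [Cape Town](from_city) to [London](to_city) [tomorrow at 10am](date_time)
    - book a [flight](travel_mode) from [Johannesburg](from_city) to [Durban](to_city) for [next Friday](date_time)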

These entities can be detected within the first pass and confirmation solicited from the user.

Let’s have a look at more complex entity structures and how they are implemented in Rasa, Microsoft LUIS and Amazon Alexa…

Rasa

Contextual & Compound

Compound and contextual entities have always been one of Rasa’s strong points.

Contextual means that entities are not recognized by asking the user directly for input, nor found via a finite lookup list; rather, entities are detected based on their context within the utterance or sentence.

This is more closely aligned with how we as humans detect entities in a conversation.

Rasa ~ Compound & Contextual Entities in Rasa-X

Compound entities mean I can capture multiple entities per intent, or user utterance. In a scenario where the user gives you all the information in one utterance, you have the ability to capture all those values in one go.

This translates into fewer dialog turns and a more efficient chatbot.

Entity Roles

The starting point of entities is that you can add labels to words; hence you can define concepts in your data.

In the example below, different city types (from_city, to_city) are defined alongside the other entities.

This is not elegant, as multiple entities need to be created for one real-world object, namely city.

And in this example, city has two roles: the city of departure and the city of arrival. With Rasa you are able to define these entities with specific roles in your project’s nlu.md file.
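
A sketch of such role annotations in nlu.md (intent name and sentences are illustrative; the role syntax follows Rasa’s markdown annotation format):

    ## intent:book_trip
    - book a trip from [Cape Town]{"entity": "city", "role": "departure"} to [London]{"entity": "city", "role": "destination"}
    - I want to travel from [Berlin]{"entity": "city", "role": "departure"} to [Amsterdam]{"entity": "city", "role": "destination"}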

The output looks something like this (an abridged sketch of the Rasa NLU result; offsets and extractor name are illustrative):
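
    "entities": [
      {
        "entity": "city",
        "start": 17,
        "end": 26,
        "value": "Cape Town",
        "role": "departure",
        "extractor": "DIETClassifier"
      },
      {
        "entity": "city",
        "start": 30,
        "end": 36,
        "value": "London",
        "role": "destination",
        "extractor": "DIETClassifier"
      }
    ]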

Entity Groups

This feature allows for entities to be grouped together with a specific group label. The best way to explain this is with an example…

Again, defined in your /data/nlu.md file. The sketch below follows Rasa’s well-known pizza-ordering example (intent name and sentence are illustrative):
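
    ## intent:order_pizza
    - a [small]{"entity": "size", "group": "1"} pizza with [mushrooms]{"entity": "topping", "group": "1"} and a [large]{"entity": "size", "group": "2"} pizza with [pepperoni]{"entity": "topping", "group": "2"}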

And the output from Rasa NLU looks something like this (abridged; start/end offsets and extractor omitted):
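
    "entities": [
      { "entity": "size",    "value": "small",     "group": "1" },
      { "entity": "topping", "value": "mushrooms", "group": "1" },
      { "entity": "size",    "value": "large",     "group": "2" },
      { "entity": "topping", "value": "pepperoni", "group": "2" }
    ]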

Microsoft LUIS

Decomposition

Machine Learned Entities were introduced to LUIS in November 2019. Entity decomposition is important both for intent prediction and for data extraction via the entity.

We start by defining a single entity, called Travel Detail.

Within this entity, we define three sub-entities. You can think of these as nested entities or sub-types. The three sub-types defined are:

  • Time Frame
  • Mode
  • City

From here, we have a sub-sub-type for City:

  • From City
  • To City

Defining an Entity With Sub-Types which can be Decomposed
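
The resulting hierarchy can be visualized like this:

    Travel Detail
    ├── Time Frame
    ├── Mode
    └── City
        ├── From City
        └── To City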

This might sound confusing, but the process is extremely intuitive and allows for the natural expansion of conversational elements.

Data is presented in an easily understandable format. Managing your conversational environment will be easier than before.

Adding Sub-Entities: ML Entity Composed of Smaller Sub-Entities

Now we can go back to our intent and annotate a new utterance. Only the From City still needs to be defined.

Annotating Utterance Example with Entity Elements

Here are the intent examples used to train the model, with the entity, sub-types and sub-sub-types fully contextualized.

Annotated Intent Examples
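
For an utterance such as the following, the decomposed structure conceptually yields nested values (a hypothetical illustration of the extracted data, not the exact LUIS response format):

    "I need to travel by train from Cape Town to London tomorrow"

    Travel Detail
      Time Frame:         tomorrow
      Mode:               train
      City -> From City:  Cape Town
      City -> To City:    London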

Alexa Conversations

The five build-time components of Alexa Conversations are:

  • Dialogs
  • Slots
  • Utterance Sets
  • Response Templates
  • API Definitions

Regarding entities, Alexa Conversations has a similar option, though not as complete and comprehensive as LUIS or Rasa. Within Conversations you can define entities, which Amazon refers to as Slots.

Two Types of Slots: Value Slots and Properties

Slots

Alexa Conversations introduces a new slot type: custom with properties (PCS).

Within Alexa Conversations you can create a slot with multiple properties attached to it. These properties can be seen as sub-slots or sub-categories which together constitute the higher-order entity: a hierarchical collection of slots. This structure can be used to pass structured data between build-time components such as API Definitions and Response Templates.
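
Conceptually, such a slot type bundles several simpler slots into one structure. A hypothetical sketch of a travel slot with properties (this is not the actual Alexa schema and the names are invented, though AMAZON.City and AMAZON.DATE are built-in Amazon slot types):

    {
      "name": "TravelDetails",
      "properties": [
        { "name": "fromCity",   "type": "AMAZON.City" },
        { "name": "toCity",     "type": "AMAZON.City" },
        { "name": "travelDate", "type": "AMAZON.DATE" }
      ]
    }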

Again, slots are really the entities you would like to fill during the conversation. Should the user utter all three required slots in the first utterance, the conversation will only have one dialog turn.


The conversation can of course be longer, should it take more turns to solicit the relevant information from the user to fill the slots. The interesting part is the two types of slots, or entities: custom-defined slots with values, and slots with properties.

Conclusion

As chatbot platforms grow and mature, development in these key areas will be inevitable. The challenge will be to present these elements in a simple and easily manageable fashion.
