The Emergence Of Entity Structures In Chatbots

And Why It Is Important For Capturing Unstructured Data Accurately & Efficiently

Introduction

Looking at the chatbot development tools and environments currently available, there are three ailments which require remedy:

  • Compound Contextual Entities
  • Entity Decomposition
  • Deprecation of Rigid State Machine, Dialog Management

In this story I am going to focus only on entities, and on how entities are managed in three Conversational AI environments: Rasa, Microsoft LUIS and Amazon Alexa.

Entities 101

What is an Entity?

The three words used most often in relation to chatbots are:

  • utterances,
  • intents and
  • entities.

An utterance is really anything the user will say. The utterance can be a sentence, in some instances a few sentences, or just a word or a phrase. During the design phase you anticipate what your users might say to your bot.

An Intent is the user’s intention with their utterance, or engagement with your bot. Think of intents as verbs, or working words. An utterance or single dialog from a user needs to be distilled into an intent.


Entities can be seen as nouns; often they are referred to as slots. These are usually things like dates, times, cities, names, brands and so on. Capturing these entities is crucial for taking action based on the user's intent.

Think of a travel bot: capturing the cities of departure and destination, travel mode, price, dates and times is the foundation of the interface. Yet this is the hardest part of the conversational interface. Keep in mind that users enter data in no particular order.

Compound and contextual entities are being implemented by more chatbot platforms, and the option to contextually annotate entities is also on the rise. Examples here are Rasa, IBM Watson Assistant and, to a lesser degree, Amazon Lex.

Compound & Contextual Entities

Huge strides have been made in this area and many chatbot ecosystems accommodate these structures.

Contextual Entities

The process of annotating user utterances is a way of identifying entities by their context within a sentence.

Contextual Entity Annotation In IBM Watson Assistant

Often entities have a finite set of defined values. Then there are entities which cannot be represented by a finite list, like the cities of the world, names or addresses. These entity types have too many variations to be listed individually.

For these entities you must use annotations: entities defined by their contextual use, detected via their context within the user utterance.

Compound Entities

The basic premise is that users will most probably express multiple entities within one utterance; these are referred to as compound entities.

In the example below, there are four entities defined:

  • travel_mode
  • from_city
  • to_city
  • date_time
Rasa: Extract of NLU.md File In Rasa Project

These entities can be detected within the first pass and confirmation solicited from the user.

Let’s have a look at more complex entity structures and how they are implemented in Rasa, Microsoft LUIS and Amazon Alexa…

Rasa

Contextual & Compound

Compound and contextual entities have been one of Rasa’s strong points all along.

Contextual means that entities are not recognized by asking the user directly for the input, or found via a finite lookup list; rather, entities are detected based on their context within the utterance or sentence.

This is closer aligned with how we as humans detect entities in a conversation.

Rasa ~ Compound & Contextual Entities in Rasa-X

Compound entities mean you can capture multiple entities per intent, or user utterance. In a scenario where the user gives you all the information in one utterance, you can capture all those values in one go.

This translates into fewer dialog turns and a more efficient chatbot.
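As a rough Python sketch of why compound entities save dialog turns, the helper below (hypothetical; the entity names mirror the travel example in this article) checks which required slots still need a follow-up question after one utterance:

```python
# Hypothetical helper: given the entities extracted from a single utterance,
# work out which required slots still need a follow-up question.
REQUIRED_SLOTS = ["travel_mode", "from_city", "to_city", "date_time"]

def missing_slots(extracted_entities):
    """Return the required slots not yet filled by the user's utterance."""
    found = {e["entity"] for e in extracted_entities}
    return [slot for slot in REQUIRED_SLOTS if slot not in found]

# A compound utterance fills everything in one turn...
full = [
    {"entity": "travel_mode", "value": "train"},
    {"entity": "from_city", "value": "Berlin"},
    {"entity": "to_city", "value": "Stuttgart"},
    {"entity": "date_time", "value": "Friday"},
]
print(missing_slots(full))     # → []

# ...while a sparse one needs extra dialog turns.
partial = [{"entity": "to_city", "value": "Stuttgart"}]
print(missing_slots(partial))  # → ['travel_mode', 'from_city', 'date_time']
```

Each remaining slot would typically translate into one more prompt to the user.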

Entity Roles

The starting point of entities is that you can add labels to words; hence you can define concepts in your data.

In the example below, different city types are defined as separate entities.

## intent:travel_details
- I want to travel by [train](travel_mode) from [Berlin](from_city) to [Stuttgart](to_city) on [Friday](date_time)

This is not elegant, as multiple entities need to be created for one real-world object, namely city.

In this example, city has two roles: the city of departure and the city of arrival. With Rasa you are able to define these entities with specific roles in your project’s nlu.md file.

## intent:travel_details
- I want to travel by [train](travel_mode) from [Berlin]{"entity": "city", "role": "depart"} to [Stuttgart]{"entity": "city", "role": "arrive"} on [Friday](date_time)

The output looks like this:

I want to travel by train from Berlin to Stuttgart on next week Wednesday.
{
  "intent": {
    "name": "travel_details",
    "confidence": 0.9981381893157959
  },
  "entities": [
    {
      "entity": "travel_mode",
      "start": 20,
      "end": 25,
      "value": "train",
      "extractor": "DIETClassifier"
    },
    {
      "entity": "city",
      "start": 31,
      "end": 37,
      "role": "depart",
      "value": "Berlin",
      "extractor": "DIETClassifier"
    },
    {
      "entity": "city",
      "start": 41,
      "end": 49,
      "role": "arrive",
      "value": "Stuttgart",
      "extractor": "DIETClassifier"
    }
  ],
  "intent_ranking": [
    {
      "name": "travel_details",
      "confidence": 0.9981381893157959
    },
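A small Python sketch of consuming output like the above (the helper name is hypothetical): the departure and arrival city are picked out of the entity list by their "role" field, so a single city entity serves both roles.

```python
def cities_by_role(entities):
    """Map each role of the 'city' entity to its extracted value."""
    return {
        e["role"]: e["value"]
        for e in entities
        if e["entity"] == "city" and "role" in e
    }

# Entity list shaped like the Rasa NLU output shown above.
entities = [
    {"entity": "travel_mode", "value": "train", "extractor": "DIETClassifier"},
    {"entity": "city", "role": "depart", "value": "Berlin", "extractor": "DIETClassifier"},
    {"entity": "city", "role": "arrive", "value": "Stuttgart", "extractor": "DIETClassifier"},
]
print(cities_by_role(entities))  # → {'depart': 'Berlin', 'arrive': 'Stuttgart'}
```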

Entity Groups

This feature allows for entities to be grouped together with a specific group label. The best way to explain this is with an example…

Again, defined in your /data/nlu.md file:

## intent:teams
- The first team will be [John]{"entity": "teamMember", "group": "1"}, [Mary]{"entity": "teamMember", "group": "1"} and [Geoff]{"entity": "teamMember", "group": "1"} and the second group to travel will be [Martha]{"entity": "teamMember", "group": "2"}, [Adam]{"entity": "teamMember", "group": "2"} and [Frank]{"entity": "teamMember", "group": "2"}.

And the output from Rasa NLU:

The first team will be John, Mary and Geoff and the second group to travel will be Martha, Adam and Frank.
{
  "intent": {
    "name": "teams",
    "confidence": 0.9999754428863525
  },
  "entities": [
    {
      "entity": "teamMember",
      "start": 23,
      "end": 33,
      "group": "1",
      "value": "John, Mary",
      "extractor": "DIETClassifier"
    },
    {
      "entity": "teamMember",
      "start": 38,
      "end": 43,
      "group": "1",
      "value": "Geoff",
      "extractor": "DIETClassifier"
    },
    {
      "entity": "teamMember",
      "start": 83,
      "end": 95,
      "group": "2",
      "value": "Martha, Adam",
      "extractor": "DIETClassifier"
    },
    {
      "entity": "teamMember",
      "start": 100,
      "end": 105,
      "group": "2",
      "value": "Frank",
      "extractor": "DIETClassifier"
    }
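Downstream, grouped entities can be collected back into the structures they describe. A minimal Python sketch (the helper name is hypothetical, and for simplicity each entity value here is a single name rather than the combined spans Rasa sometimes extracts):

```python
from collections import defaultdict

def teams_from_entities(entities):
    """Collect teamMember values under their group label."""
    teams = defaultdict(list)
    for e in entities:
        if e["entity"] == "teamMember":
            teams[e["group"]].append(e["value"])
    return dict(teams)

entities = [
    {"entity": "teamMember", "group": "1", "value": "John"},
    {"entity": "teamMember", "group": "1", "value": "Mary"},
    {"entity": "teamMember", "group": "1", "value": "Geoff"},
    {"entity": "teamMember", "group": "2", "value": "Martha"},
    {"entity": "teamMember", "group": "2", "value": "Adam"},
    {"entity": "teamMember", "group": "2", "value": "Frank"},
]
print(teams_from_entities(entities))
# → {'1': ['John', 'Mary', 'Geoff'], '2': ['Martha', 'Adam', 'Frank']}
```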

Microsoft LUIS

Decomposition

Machine learned entities were introduced to LUIS in November 2019. Entity decomposition is important both for intent prediction and for data extraction with the entity.

We start by defining a single entity called Travel Detail.

Within this entity, we define three sub-entities; you can think of these as nested entities or sub-types. The three sub-types defined are:

  • Time Frame
  • Mode
  • City

From here, we define two sub-sub-types for City:

  • From City
  • To City
Defining an Entity With Sub-Types which can be Decomposed

This might sound confusing, but the process is extremely intuitive and allows for the natural expansion of conversational elements.

Data is presented in an easily understandable format, and managing your conversational environment will be easier than before.
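To make the decomposition concrete, the nested shape below is a hand-written illustration of the Travel Detail entity with its sub-types and sub-sub-types (it is not a verbatim LUIS response). A recursive walk flattens the hierarchy into dotted paths:

```python
def flatten(entity, prefix=""):
    """Yield (dotted_path, value) pairs from a nested entity dict."""
    for name, value in entity.items():
        path = f"{prefix}.{name}" if prefix else name
        if isinstance(value, dict):
            yield from flatten(value, path)
        else:
            yield path, value

# Illustrative decomposed entity: sub-types, with City further decomposed.
travel_detail = {
    "Time Frame": "Friday",
    "Mode": "train",
    "City": {"From City": "Berlin", "To City": "Stuttgart"},
}
print(dict(flatten(travel_detail)))
# → {'Time Frame': 'Friday', 'Mode': 'train',
#    'City.From City': 'Berlin', 'City.To City': 'Stuttgart'}
```

The point of decomposition is exactly this: one entity captures the whole travel detail, while each constituent part remains individually addressable.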

Adding Sub-Entities: ML Entity Composed of Smaller Sub-Entities

Now we can go back to our intent and annotate a new utterance. Only the From City still needs to be defined.

Annotating Utterance Example with Entity Elements

Here are the intent examples used to train the model, with the entity, sub-types and sub-sub-types fully contextualized.

Annotated Intent Examples

Alexa Conversations

The five build-time components of Alexa Conversations are:

  • Dialogs
  • Slots
  • Utterance Sets
  • Response Templates
  • API Definitions

Regarding entities, Alexa Conversations has a similar option, though not as complete and comprehensive as LUIS or Rasa. Within Conversations you can define entities, which Amazon refers to as slots.

Two Types of Slots: Value Slots and Properties

Slots

Alexa Conversations introduces a new slot type: custom with properties (PCS).

Within Alexa Conversations you can create a slot with multiple properties attached to it. These properties can be seen as sub-slots or sub-categories which together constitute the higher-order entity.

This constitutes a hierarchical collection of slots, which can be used to pass structured data between build-time components such as API Definitions and Response Templates.

Again, slots are really the entities you would like to fill during the conversation. Should the user utter all the required slots in the first utterance, the conversation will have only one dialog turn.


The conversation can of course be longer, should it take more dialog turns to solicit the relevant information from the user to fill the slots. The interesting part is the two types of slots, or entities: custom slots with values, and custom slots with properties.
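As an illustration only (the slot and property names below are made up for this sketch, not Alexa's actual schema), a slot with properties can be modeled as one higher-order value whose sub-slots travel together, with unfilled properties driving further dialog turns:

```python
# Hypothetical model of a custom slot type with properties: the sub-slots
# travel together as one structured value between build-time components.
trip = {
    "type": "TripDetails",
    "properties": {
        "fromCity": "Berlin",
        "toCity": "Stuttgart",
        "travelDate": "Friday",
    },
}

def unfilled_properties(slot, required):
    """Return the property names still missing from the slot."""
    have = slot["properties"]
    return [name for name in required if not have.get(name)]

# All properties captured in the first utterance: a single dialog turn.
print(unfilled_properties(trip, ["fromCity", "toCity", "travelDate"]))  # → []
```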

Conclusion

As chatbot platforms grow and mature, development in these key areas will be inevitable. The challenge will be to present these elements in a simple and easily manageable fashion.

Written by

NLP/NLU, Chatbots, Voice, Conversational UI/UX, CX Designer, Developer, Ubiquitous User Interfaces. www.cobusgreyling.me
