Photo by Mantas Hesthaven on Unsplash

Your Chatbot Should Accommodate These Six Entity Types

Capturing Entities Efficiently Is the Holy Grail of Chatbot Design

Cobus Greyling
6 min read · Oct 29, 2019


Introduction

Here is the challenge: your chatbot invites the user to enter unstructured, conversational data in natural language. That is very enticing for users; there is real charm in speaking to a computer interface and being understood.

For the first time the shoe is on the other foot: the computer needs to figure the humans out, and not the other way round.

What is an Entity?

The three terms used most often in relation to chatbots are utterances, intents and entities.

An utterance is really anything the user says. It can be a sentence, in some instances a few sentences, or just a word or a phrase. During the design phase you anticipate what your users might say to your bot.

An intent is the user’s intention behind their utterance, or their engagement with your bot. Think of intents as verbs, or action words. An utterance, a single dialog turn from a user, needs to be distilled into an intent.

Entities can be seen as nouns; they are often referred to as slots. These are usually things like dates, times, cities, names, brands and so on. Capturing these entities is crucial for taking action based on the user’s intent.

Think of a travel bot: capturing the cities of departure and destination, travel mode, price, dates and times is the foundation of the interface. Yet this is the hardest part of the chatbot, because the user enters data randomly and in no particular order.
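To make the utterance/intent/entity split concrete, here is a minimal sketch of how a single travel utterance might decompose into an intent and its slots. The intent and slot names (book_flight, origin, destination, travel_date) are illustrative, not taken from any specific framework.

```python
# Illustrative only: how a travel utterance decomposes into an intent and entities.
utterance = "Book me a flight from Cape Town to London next Friday"

# What the NLU layer is expected to return (names are hypothetical):
parsed = {
    "intent": "book_flight",          # the verb / action behind the utterance
    "entities": {
        "origin": "Cape Town",        # city of departure
        "destination": "London",      # city of arrival
        "travel_date": "next Friday", # still needs resolving to a real date
    },
}

# The dialog layer then acts on the intent, using the entities as slots.
missing = [slot for slot in ("origin", "destination", "travel_date")
           if slot not in parsed["entities"]]
if missing:
    print(f"Ask the user for: {', '.join(missing)}")
else:
    print(f"Search flights {parsed['entities']['origin']} -> "
          f"{parsed['entities']['destination']} on {parsed['entities']['travel_date']}")
```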

The list of entity types we will be looking at is:

· Simple Entities
· Composite Entities
· Entity Roles
· Entity Lists
· Regular Expressions
· Prebuilt Models

Simple Entities

If you have to extract a simple job name or city name, you really want to extract that entity based on the context in which it is expressed. This is particularly handy when the list of possible values is effectively infinite. As humans we detect entities from context, which is how we know where to pick out a city name even if we have never heard that city name before.

Tutorial: Example of Simple Contextual Entities

Again, the chatbot framework you are using must be able to detect the entity contextually. If your development framework relies on a finite list of entities, it will run into trouble the moment requirements grow.

The purpose of the simple (contextual) entity is to teach your NLU where the entity can be found in an utterance. The part of the utterance that makes up the entity can change from utterance to utterance, depending on word choice and utterance length, so your NLU needs examples of the entity across all the intents that use it (job names, in this example); see the labeling sketch after the checklist below.

The simple entity is a good fit for this type of data when:

· Data is a single concept.
· Data is not consistently well-formatted, in the way a regular expression would require.
· Data is not a common type, such as a prebuilt phone number or date entity.
· Data is not matched exactly to a list of known words, as with a list entity.
· Data does not contain other data items, as with a composite entity or contextual roles.
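As a rough sketch of what “teaching your NLU where the entity can be found” looks like, here are a few labeled training utterances for a contextual JobName entity. The JSON-like shape and the entity name are assumptions for illustration; every framework has its own labeling format.

```python
# Hypothetical training data for a contextual "JobName" entity.
# The entity value changes and moves around, but the surrounding context
# ("apply for", "opening for", "the ... role") is what the model learns from.
training_examples = [
    {"text": "I want to apply for the engineering position",
     "entities": [{"entity": "JobName", "start": 24, "end": 35}]},   # "engineering"
    {"text": "Is there an opening for a data analyst in Berlin?",
     "entities": [{"entity": "JobName", "start": 26, "end": 38}]},   # "data analyst"
    {"text": "Tell me more about the night-shift security role",
     "entities": [{"entity": "JobName", "start": 23, "end": 43}]},   # "night-shift security"
]

# Quick sanity check that the labeled spans match the intended text.
for ex in training_examples:
    for ent in ex["entities"]:
        print(ent["entity"], "->", ex["text"][ent["start"]:ent["end"]])
```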

Composite Entities

Tutorial: Example of Composite Entities with Related Data

Not many chatbot development environments allow for this, but what if you want to capture two entities that are related? This is referred to as a composite entity. Say, for instance, you have to manage the location of employees, and you have a list of offices and campuses.

Obviously the campus and the office are linked; perhaps there are a few offices on a campus. This is a scenario of parent and child entities that you need to capture together, with the added bonus of not having to ask the user two questions in a fixed-journey fashion.

So you would have a “Campus” entity and an “Office” entity, but you want them linked; hence composite entities. Your NLU returns an entities array, and each entity is given a type and a value.

To find more precision for each child entity, use the combination of type and value from the composite array item to find the corresponding item in the entities array.
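Here is a minimal sketch of that lookup, assuming a simplified response shape. The field names children, type and value below are illustrative, not the exact schema of any particular NLU service.

```python
# Hypothetical NLU response for: "Move Sam to office B-1200 on the North campus"
response = {
    "entities": [
        {"type": "Campus", "value": "North campus"},
        {"type": "Office", "value": "B-1200"},
        {
            "type": "EmployeeLocation",   # the composite (parent) entity
            "value": "office B-1200 on the North campus",
            "children": [                 # the child entities captured together
                {"type": "Campus", "value": "North campus"},
                {"type": "Office", "value": "B-1200"},
            ],
        },
    ]
}

def resolve_children(response, composite_type):
    """For each child of the composite, find the matching item in the flat entities array."""
    composite = next(e for e in response["entities"] if e["type"] == composite_type)
    return [
        next(e for e in response["entities"]
             if e["type"] == child["type"] and e["value"] == child["value"])
        for child in composite["children"]
    ]

for entity in resolve_children(response, "EmployeeLocation"):
    print(entity["type"], "=", entity["value"])
# Campus = North campus
# Office = B-1200
```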

Entity Roles

Tutorial: Example of Entity Roles

What if you could extract contextually related information from an utterance? A role or roles can be used when the entity data to extract:

· Is related to each other in the context of the utterance.

· Uses specific word choice to indicate each role. Examples of these words include: from/to, leaving/headed to, away from/toward.

· Frequently appears with both roles in the same utterance, allowing your NLU model to learn from this contextual usage.

· Needs to be grouped and processed by the client app as a unit of information.

A good example of entity roles is travel, where you have a “to” and a “from” entity. Entity roles allow both to be detected on the fly. This is really one entity with multiple roles: a Geography entity can have a To and a From role, and a Name entity can have a Boy and a Girl role.
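A sketch of how a client app might consume roles, assuming the NLU returns the role alongside each entity; the field and role names here are assumptions for illustration.

```python
# Hypothetical response for: "Book a ticket from Johannesburg to Durban"
# Both values share one entity type (City); the *role* tells them apart.
response = {
    "entities": [
        {"type": "City", "role": "origin",      "value": "Johannesburg"},
        {"type": "City", "role": "destination", "value": "Durban"},
    ]
}

def by_role(response, entity_type, role):
    """Pick out the entity of a given type playing a given role, if present."""
    return next((e["value"] for e in response["entities"]
                 if e["type"] == entity_type and e["role"] == role), None)

origin = by_role(response, "City", "origin")
destination = by_role(response, "City", "destination")
print(f"Travelling from {origin} to {destination}")
# Travelling from Johannesburg to Durban
```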

Entity Lists

Tutorial: Example of List Entities

See this as a drop-down list for entities: synonyms are matched and the result is normalized.

A list entity is an exact text match to the words in the utterance.

Each item on the list can include a list of synonyms. For example, a company department can be identified by several key pieces of information, such as an official name, common acronyms, and billing department codes.

A list entity is a good choice for this type of data when:

· The data values are a known set.

· The set doesn’t exceed the maximum LUIS boundaries for this entity type.

· The text in the utterance is an exact match with a synonym or the canonical name.

LUIS doesn’t use the list beyond exact text matches. Stemming, plurals, and other variations are not resolved with just a list entity. To manage variations, consider using a pattern with the optional text syntax.
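A small sketch of that exact-match-plus-synonym behaviour, with made-up department names and codes. A real list entity does this matching inside the NLU service; the client-side version below is only to show the normalization idea.

```python
import re

# Hypothetical list entity: canonical department name -> known synonyms.
department_list = {
    "Human Resources": ["hr", "people ops", "dept-001"],
    "Information Technology": ["it", "tech support", "dept-002"],
}

# Build a lookup from every synonym (and the canonical name) to the canonical value.
lookup = {}
for canonical, synonyms in department_list.items():
    lookup[canonical.lower()] = canonical
    for s in synonyms:
        lookup[s.lower()] = canonical

def match_department(utterance):
    """Return the normalized department if a known phrase appears verbatim (whole words only)."""
    text = utterance.lower()
    for phrase, canonical in lookup.items():
        if re.search(r"\b" + re.escape(phrase) + r"\b", text):
            return canonical
    return None  # no exact match; stemming/plurals are deliberately not handled

print(match_department("Please route this ticket to people ops"))   # Human Resources
print(match_department("dept-002 owns that server"))                # Information Technology
```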

Regular Expressions

Tutorial: Example of Using Regular Expressions

There will be scenarios where you need to extract consistently formatted data from an utterance using a regular expression entity. The demo below uses a regular expression entity to pull well-formatted Human Resources (HR) form numbers out of an utterance.

I like the way the intents and entities are linked; once the entity is defined, it is picked up automatically with the creation of the intent examples.

While the utterance’s intent is always determined with machine learning, this specific entity type is not machine-learned.
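A sketch of the deterministic matching this entity type performs, assuming the HR form numbers follow a pattern like hrf- plus six digits; the exact format used in the demo may differ.

```python
import re

# Assumed format for the HR form numbers: "hrf-" followed by six digits (illustrative).
HR_FORM_PATTERN = re.compile(r"hrf-\d{6}", re.IGNORECASE)

utterance = "Where can I submit HRF-123456 and hrf-234567 for sign-off?"

# Unlike the machine-learned entities above, this is a plain deterministic match.
form_numbers = HR_FORM_PATTERN.findall(utterance)
print(form_numbers)   # ['HRF-123456', 'hrf-234567']
```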

Prebuilt Models

Tutorial: Example of Prebuilt Entity Models

Various NLU environments allow prebuilt entities to be added to an app to quickly gain intent prediction and contextual data extraction.

You do not need to mark any utterances with prebuilt entities because they are detected automatically. I find it very intuitive that the entities are picked up automatically from the intents’ example utterances.

This saves you the trouble of manually selecting and assigning the entities. From the video it is clear that the two entities, name and place, are detected contextually. The name is not detected via a finite list or fuzzy matching of names, but in a true contextual manner.

Prebuilt models (domains, intents, and entities) help you build your model quickly.
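A sketch of what consuming prebuilt entities might look like on the client side. The type names below are modeled on LUIS-style prebuilt entities (personName, geographyV2, datetimeV2), but the response shape is simplified and assumed for illustration.

```python
# Hypothetical response for: "Book a table for Anna in Lisbon tomorrow at 7pm"
# No labelling was needed for these entities; the prebuilt recognizers found them.
response = {
    "intent": "book_table",
    "entities": [
        {"type": "personName", "value": "Anna"},             # prebuilt name recognizer
        {"type": "geographyV2", "value": "Lisbon"},           # prebuilt place recognizer
        {"type": "datetimeV2", "value": "tomorrow at 7pm"},   # prebuilt date/time recognizer
    ],
}

# The client app simply reads the prebuilt types it cares about.
def first_of(response, entity_type):
    return next((e["value"] for e in response["entities"] if e["type"] == entity_type), None)

print("Guest:", first_of(response, "personName"))
print("City: ", first_of(response, "geographyV2"))
print("When: ", first_of(response, "datetimeV2"))
```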


Cobus Greyling

I explore and write about all things at the intersection of AI & language; LLMs/NLP/NLU, Chat/Voicebots, CCAI. www.cobusgreyling.com