
How To Create An Open-Source NLU API With Rasa

You Have Access To An Exceptional Natural Language Understanding Tool

Cobus Greyling
5 min read · Sep 20, 2020


Introduction

There is much hype regarding chatbots and conversational AI in general.

Rasa NLU API called from Postman

Technologies are often compared to each other in order to find the best fit for an organization or task.

The problem is that, with all the technology available, the initial decisions are easy to make.

It is the subsequent design and architecture decisions, once the system is established, that are harder.

In this article I would like to focus on using an open-source option which will serve well as a seed project.

But more than that, a framework which can evolve and support a fully fledged enterprise solution.

Starting with the NLU API will lay a good foundation to grow the solution into a fully fledged conversational interface.

NLU & Core

There are a few chatbot platforms which have a clear separation between the NLU portion and the dialog management and integration portions. This allows for the development of a stand-alone NLU API.

Rasa chatbot architecture with NLU portion marked.

The Rasa architecture gives you the opportunity to have an NLU API which can also be used for natural language understanding tasks not related to live conversations. These include conversations archived in email, live-agent conversations, etc.

NLU Data Format

There are exceptionally good videos by Rasa on how to install in different environments.

Here you can find a video by Rachel for Windows 10 installation.

Secondly, to get started, Rasa has a series of Masterclass videos, which are a time-efficient way of getting up to speed with the whole Rasa stack.

But, back to creating an NLU API…

The NLU.md file within the Rasa file structure

The examples below used for NLU training are based on a GitHub project.

The NLU training file has the entities annotated within the intents. Hence we see the healthy trend of intents and entities merging.

Intents and entities are in a text file in markdown format.

Intents & Entities

Simple Examples

The simplest example is just having an intent defined with some examples sans any entities. Below is the check_balance intent with a few examples.
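In Rasa's Markdown training format such an intent could look as follows; the example utterances here are my own illustrations, not taken from the referenced GitHub project:

```md
## intent:check_balance
- what is my balance
- how much money do I have
- how much is in my account
```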

Let’s add an entity of type account_type with the example credit card.
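Entities are annotated inline using link-style syntax, with the entity text in square brackets and the entity type in parentheses. A sketch, again with illustrative utterances:

```md
## intent:check_balance
- how much do I have on my [credit card](account_type) account
- what is the balance on my [savings](account_type) account
```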

The result:

The intent is check_balance and the account_type entity has the value credit.
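The parse response returned by the API is a JSON structure along these lines; the confidence and character positions below are purely illustrative:

```json
{
  "intent": {
    "name": "check_balance",
    "confidence": 0.97
  },
  "entities": [
    {
      "entity": "account_type",
      "value": "credit",
      "start": 26,
      "end": 32
    }
  ]
}
```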

Regular Expressions

This is how regular expressions are defined within the nlu.md file…
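In the Markdown format, a regular expression feature is declared in its own block. The pattern below, for a ten-digit account number, is an assumption on my part:

```md
## regex:accountNumber
- [0-9]{10}
```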

And the output…

The intent is marked and the entity of accountNumber is extracted.
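The parse result then carries the extracted number as an entity, along these lines (all values illustrative):

```json
{
  "intent": {
    "name": "check_balance",
    "confidence": 0.95
  },
  "entities": [
    {
      "entity": "accountNumber",
      "value": "1234567890"
    }
  ]
}
```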

Normalize Data

Data captured in an entity can be normalized. Here, anything related to credit card accounts is captured under the entity account_type, but always with the value credit.
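In the Markdown format this normalization can be expressed in two ways: inline, by annotating the entity with an explicit value, or with a dedicated synonym block. A sketch of both, with illustrative utterances:

```md
## intent:check_balance
- what is the balance on my [credit card](account_type:credit) account

## synonym:credit
- credit card
- creditcard account
```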

And the result…

The input of creditcard accout is normalized to credit in the account_type entity.

Here you see how the rather skewed input of “creditcard accout” is normalized to the value of credit in the account_type entity.

Models

Each time you train your model, a new model file is created in the models folder. This allows you to change between models for your API, roll back to previous versions etc.

Different trained models can be invoked when running the API
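A specific model archive can be passed to the server with the -m flag; the timestamped file name below is just an example of Rasa's naming convention, not an actual model from this project:

```shell
rasa run --enable-api -m models/20200920-104741.tar.gz
```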

Creating The API

With a single command the API is launched on port 5005. But first, make sure you have activated your Anaconda virtual environment.
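Assuming an Anaconda environment named rasa2, activation looks like this:

```shell
conda activate rasa2
```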

My virtual environment is called rasa2.

Run the API:
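The command below starts the Rasa server with the HTTP API enabled, listening on the default port 5005:

```shell
rasa run --enable-api
```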

You can access the API on the URL
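With the server running locally, the parse endpoint is available at:

```
http://localhost:5005/model/parse
```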

Interact with your API via a client like Postman.

Sending a JSON query string to the Rasa API
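A minimal sketch of reading such a response in Python. The response body here is a hand-written illustration of Rasa's parse output, not captured from a live server:

```python
import json

# Illustrative Rasa /model/parse response (all values are assumptions)
raw = """
{
  "text": "what is the balance on my credit card account",
  "intent": {"name": "check_balance", "confidence": 0.97},
  "entities": [
    {"entity": "account_type", "value": "credit", "start": 26, "end": 37}
  ]
}
"""

response = json.loads(raw)

# Pull out the top intent and a simple entity-to-value map
intent = response["intent"]["name"]
entities = {e["entity"]: e["value"] for e in response["entities"]}

print(intent)    # check_balance
print(entities)  # {'account_type': 'credit'}
```

The equivalent of the Postman call can also be made with `curl -s -X POST http://localhost:5005/model/parse -d '{"text": "what is the balance on my credit card account"}'`.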

Conclusion

There are other features available when defining the training data; the ones mentioned here are, in my opinion, the primary ones. In a follow-up article I want to look at the structures available to entities.
