The Chatbot Development Cycle

How to Design, Develop, Deploy and Continuously Improve your Assistant

Cobus Greyling

--

Chatbots Fitting into the Greater Scheme of Things

Chatbots need to be seen as analogous to traditional IVR systems. Traditional DTMF interactive voice response systems were employed to offer some form of self-service to the caller before the call potentially reached an agent. Success rates in terms of call deflection can range from 40% to as high as 80%, depending on the use case.

Added to this, some IVR systems act as an auto-attendant, where the user routes their own call by choosing a menu option and pressing a key.

Chatbots should be seen as ITR (Interactive Text Response) systems, acting as conversation deflection away from live agent chats. Many customers currently complain about long wait times for live agent chats, with some banking apps having clients wait 45 minutes or longer for a conversation. This is where chatbots can play a critical role in self-service.

Automatic categorization of conversations must be seen as analogous to an auto attendant. Without the customer explicitly stating the category of their conversation, the chatbot can detect the category and route the conversation to a relevant agent.

The language of the conversation can also be detected implicitly, and the conversation routed to an agent with the right language skills.
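As a rough illustration, the routing logic itself can be a simple lookup from detected category and language to an agent queue. The sketch below is hypothetical; detect_category and detect_language stand in for whichever classifier and language-detection service you actually use.

```python
# Minimal routing sketch: assumes category and language are already
# detected by your NLU service; the detection functions here are stubs.

AGENT_QUEUES = {
    ("collections", "en"): "queue_collections_en",
    ("collections", "es"): "queue_collections_es",
    ("orders", "en"): "queue_orders_en",
}
DEFAULT_QUEUE = "queue_general"


def detect_category(text: str) -> str:
    # Stub: replace with a call to your intent/category classifier.
    return "collections" if "payment" in text.lower() else "orders"


def detect_language(text: str) -> str:
    # Stub: replace with a call to a language-detection service.
    return "en"


def route_conversation(text: str) -> str:
    """Pick an agent queue from the detected category and language."""
    key = (detect_category(text), detect_language(text))
    return AGENT_QUEUES.get(key, DEFAULT_QUEUE)


print(route_conversation("I want to arrange a payment plan"))
# -> queue_collections_en
```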

This perspective shows how critical an upfront conversational interface is to a live agent chat strategy.

Continuous Process of Development, Testing and Deployment

Step 1

Define a narrow set of key customer needs you want to incorporate into your assistant. Think of business processes which can be completed within the chatbot. But do not confine yourself to only processes which can be completed end-to-end; also think of data which the assistant can collect on behalf of the agent prior to handing the conversation over. The aim is to automate the menial tasks the agent might otherwise face.

The domain of your conversational interface must be narrow. Don’t be apprehensive if you are starting small.

Step 2

Create intents which define the actions you have decided on in step one. Placing an order might be #Place_Order. Collections can be grouped under #Collections.

Think of intents in the following two ways. First, if a representative is behind a counter and a customer walks up, the first thing the representative does is intent discovery: what is the intent of the customer facing them? Second, think of Google Search as the greatest intent discovery machine there is; its main aim is to discover the intention behind your search.

Intents are purposes or goals that are expressed in a customer’s input, such as answering a question or processing a payment. By recognizing the intent expressed in a customer’s input, the assistant can choose the correct dialog flow for responding to it.
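As an illustration, a small intent catalogue can be thought of as a mapping from intent names to example utterances used to train the classifier. The intent names and utterances below are made up; in practice they would be defined in your NLU tool of choice.

```python
# Illustrative intent catalogue: each intent maps to a handful of
# example utterances used to train the classifier. Names are hypothetical.
INTENTS = {
    "Place_Order": [
        "I would like to order a new sim card",
        "Can I place an order for the black one?",
        "Please order two of these for me",
    ],
    "Collections": [
        "I missed my payment last month",
        "Can I arrange a payment plan?",
        "Why was my account handed over?",
    ],
}

for intent, examples in INTENTS.items():
    print(f"#{intent}: {len(examples)} training examples")
```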

A great source of insight for defining intents is to look at current conversations. If any form of existing customer dialog with agents via chat is available, intents can be gleaned from this data.

Step 3

Usually a dialog flow needs to be defined: different dialogs which are invoked based on the intent detected. This is unfortunately still one of the more rigid elements of a conversational interface.

IBM Watson Assistant Dialog Flow
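Most tools express the flow as a tree of dialog nodes, each with a condition (typically an intent) and a scripted response. The sketch below is tool-agnostic; the node names and wording are illustrative, not a specific product's format.

```python
# Tool-agnostic dialog flow sketch: each node fires when its condition
# matches the detected intent; the last node acts as the fallback.
DIALOG_NODES = [
    {"condition": "Place_Order", "response": "Sure, what would you like to order?"},
    {"condition": "Collections", "response": "I can help with payment arrangements."},
    {"condition": "anything_else", "response": "Sorry, I didn't get that. Could you rephrase?"},
]


def respond(detected_intent: str) -> str:
    """Walk the nodes in order and return the first matching response."""
    for node in DIALOG_NODES:
        if node["condition"] in (detected_intent, "anything_else"):
            return node["response"]
    return DIALOG_NODES[-1]["response"]


print(respond("Collections"))
```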

Step 4

Create your entities. If intents are verbs, then entities are nouns. Entities are the data you want to capture during a conversation: places, dates, times, names and so on.

Current conversations can be mined to see what data agents capture during live chats. The conversational interface can also capture this data itself, and then pass the context to an agent to close off the transaction once the bot has collected it.
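As a rough sketch, entities can be modelled as named value lists with synonyms, which the assistant extracts from the user's input. The entity names, values and the naive substring matching below are purely illustrative.

```python
# Illustrative entity definitions: each entity is a set of values with
# synonyms. Real tools usually add pattern/regex and system entities
# (dates, numbers) on top of this.
ENTITIES = {
    "branch": {
        "Cape Town": ["cpt", "cape town office"],
        "Johannesburg": ["jhb", "joburg", "johannesburg office"],
    },
    "product": {
        "credit card": ["card", "credit-card"],
        "home loan": ["bond", "mortgage"],
    },
}


def extract_entities(text: str) -> dict:
    """Very naive extraction: match entity values and synonyms by substring."""
    found = {}
    lowered = text.lower()
    for entity, values in ENTITIES.items():
        for value, synonyms in values.items():
            if value.lower() in lowered or any(s in lowered for s in synonyms):
                found[entity] = value
    return found


print(extract_entities("I want a home loan at the Joburg office"))
# -> {'branch': 'Johannesburg', 'product': 'home loan'}
```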

Step 5

Test your bot incrementally as you go along in the development process. Do not leave testing to the very end. As you complete a section, go ahead and test it.

Test Your Bot and Improve the Dialog

Most development environments have a test pane or test environment, and usually you can make amendments and corrections right there as you test.
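Beyond the built-in test pane, it helps to keep a small regression set of utterances with their expected intents and to run it every time you change the skill. The classify function below is a stub for whatever classification call your platform exposes.

```python
# Minimal regression check: each utterance should resolve to the expected
# intent. `classify` is a stand-in for your assistant's classification call.

def classify(text: str) -> str:
    # Stub: replace with a call to your assistant / NLU service.
    return "Collections" if "payment" in text.lower() else "Place_Order"


TEST_CASES = [
    ("Can I arrange a payment plan?", "Collections"),
    ("I'd like to order a new sim", "Place_Order"),
]

failures = [(t, e, classify(t)) for t, e in TEST_CASES if classify(t) != e]
for text, expected, got in failures:
    print(f"FAIL: '{text}' expected {expected}, got {got}")
print(f"{len(TEST_CASES) - len(failures)}/{len(TEST_CASES)} test cases passed")
```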

Step 6

In this Example the Medium is WhatsApp

Now you can deploy your chatbot. This step is often deemed menial, but in most cases it takes the most work. The chatbot needs to be hosted somewhere, and three integration touch points need to be addressed. The first touch point is the medium: the bot needs to be integrated with Facebook Messenger, Slack, WhatsApp or whichever channel you choose. Secondly, the chatbot requires integration with a data source, typically your company's CRM, ticketing system and the like. And thirdly, integration with the live agent interface, for the dialog hand-off to agents.
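To make the touch points concrete, the skeleton below uses Flask as an example web framework: the medium (for instance a WhatsApp provider) posts inbound messages to a webhook, while the CRM lookup and live agent hand-off are hypothetical stubs for your own systems.

```python
# Skeleton of the three integration touch points, using Flask as an
# example web framework. The medium posts inbound messages to /webhook;
# the CRM lookup and agent hand-off functions are hypothetical stubs.
from flask import Flask, request, jsonify

app = Flask(__name__)


def lookup_customer(phone_number: str) -> dict:
    # Stub: replace with a call to your CRM or ticketing system.
    return {"phone": phone_number, "name": "Unknown"}


def hand_off_to_agent(conversation: dict) -> None:
    # Stub: push the conversation context to your live-agent platform.
    pass


@app.route("/webhook", methods=["POST"])
def webhook():
    payload = request.get_json(force=True)
    text = payload.get("text", "")
    customer = lookup_customer(payload.get("from", ""))

    if "agent" in text.lower():
        hand_off_to_agent({"customer": customer, "last_message": text})
        reply = "Let me connect you to an agent."
    else:
        reply = f"Hi {customer['name']}, how can I help you today?"

    # The reply is normally sent back via the messaging provider's API.
    return jsonify({"reply": reply})


if __name__ == "__main__":
    app.run(port=8080)
```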

Step 7

After deployment, create a snapshot, and save this version of your chatbot. By doing this, you are exercising due diligence and there might be instances where you need to restore your service to this snapshot.

There will be instances where you make changes which result in a degradation of your chatbot's efficiency. In cases like this, you want a marker to roll back to, hence always follow a measured approach.
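If your platform allows the skill or workspace to be exported as JSON, a snapshot can be as simple as saving that export under a timestamped filename. The file and folder names below are assumptions.

```python
# Minimal snapshot sketch: copy an exported skill/workspace JSON to a
# timestamped file so there is always a known-good version to roll back to.
import shutil
from datetime import datetime
from pathlib import Path

EXPORT_FILE = Path("skill-export.json")   # assumed export from your authoring tool
SNAPSHOT_DIR = Path("snapshots")

if EXPORT_FILE.exists():
    SNAPSHOT_DIR.mkdir(exist_ok=True)
    stamp = datetime.now().strftime("%Y%m%d-%H%M%S")
    snapshot = SNAPSHOT_DIR / f"skill-{stamp}.json"
    shutil.copy2(EXPORT_FILE, snapshot)
    print(f"Saved snapshot to {snapshot}")
else:
    print("No export found; export the skill/workspace from your tool first.")
```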

Step 8

Our chatbot is still in the development stage. Use the metrics at your disposal to measure your bot's performance as a focus group tests the interface. You will need to ensure that people test not only the "happy path" but also the outliers in terms of use cases.
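Two metrics that are easy to compute even at this stage are the fallback rate (how often the bot fails to recognise an intent) and containment (how many conversations are handled without an agent). The log format below is assumed; adapt the field names to your own platform.

```python
# Simple test-phase metrics over a list of conversation records. The log
# format here is assumed, not a specific platform's schema.
test_logs = [
    {"intent": "Place_Order", "handed_off": False},
    {"intent": None, "handed_off": True},           # bot did not understand
    {"intent": "Collections", "handed_off": False},
    {"intent": "Collections", "handed_off": True},  # understood, then escalated
]

total = len(test_logs)
fallback_rate = sum(1 for c in test_logs if c["intent"] is None) / total
containment = sum(1 for c in test_logs if not c["handed_off"]) / total

print(f"Fallback rate: {fallback_rate:.0%}")
print(f"Containment (handled without an agent): {containment:.0%}")
```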

Step 9

Once you are fairly satisfied with your conversational interface, deploy it into production. A soft launch is always advisable, allowing a limited group of your usual customers to interact with the bot first.

Step 10

Continuous monitoring of conversation logs is important to get a sense of the interface's performance.

It is important to keep in mind that analyzing logs and improving the dialog is an ongoing process. Having a once-a-week meeting to discuss the dialog and update the intents, entities, dialog flow or script is crucial. Part of this process can be to see which functions customers are asking for that do not exist yet; the chatbot can be expanded based on these requirements.
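One practical way to drive that weekly meeting is to list the most frequent utterances that fell below the confidence threshold, as candidates for new intents or additional training examples. The log format and threshold below are assumptions.

```python
# Sketch of a weekly log review: surface the most frequent utterances
# that fell below the confidence threshold, as candidates for new
# intents or extra training examples. Log format is assumed.
from collections import Counter

conversation_logs = [
    {"text": "can I freeze my card", "intent": None, "confidence": 0.21},
    {"text": "freeze my card please", "intent": None, "confidence": 0.18},
    {"text": "order a new sim", "intent": "Place_Order", "confidence": 0.93},
]

THRESHOLD = 0.4
unmatched = Counter(
    log["text"].lower()
    for log in conversation_logs
    if log["intent"] is None or log["confidence"] < THRESHOLD
)

for utterance, count in unmatched.most_common(10):
    print(f"{count:>3}  {utterance}")
```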

One might look at this process as informed supervised learning.

One last thing…

Often not enough time is spent on the script: the actual wording with which the chatbot responds. This wording informs the user on what is possible, what functions are available, and what their next course of action might be.

If the dialog is not informative enough, the user might not know what the appropriate action is.
