15 General Conversational AI Trends & Observations

Gleaning Insights From The Gartner & IDC Leaders


One could argue that traditionally chatbots, or conversational AI agents, are constituted by a four-pillar architecture.

This architecture is largely universal across commercial chatbot platforms, being:

  • Intents (NLU)
  • Entities (NLU)
  • Bot Responses (aka Script / Bot Dialog)
  • State Machine (Dialog) Management

As seen here, there are two main components:

  • The NLU component and
  • The Dialog Management component.

In the case of Microsoft, Rasa, Amazon Lex, Oracle, etc., the distinction and separation between these two components is clear and pronounced. In the case of IBM Watson Assistant, it is not.

The NLU component is constituted by:

  • Intents (verb) and
  • Entities (nouns).

The Dialog component by:

  • Bot responses and
  • The dialog state machine (development & management).
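
The four pillars above can be sketched as plain data structures. This is a minimal, illustrative sketch only; all intent, entity, response and state names are hypothetical examples, not taken from any specific platform.

```python
# Pillar 1 — Intents (NLU): the "verbs", with example training utterances.
INTENTS = {
    "buy_dog": ["I am thinking of buying a dog", "I want to get a puppy"],
    "greet": ["hello", "hi there"],
}

# Pillar 2 — Entities (NLU): the "nouns" recognised inside utterances.
ENTITIES = {
    "breed": ["labrador", "poodle", "beagle"],
}

# Pillar 3 — Bot responses (aka the script / bot dialog).
RESPONSES = {
    "buy_dog": "Great! What breed are you interested in?",
    "greet": "Hello! How can I help?",
}

# Pillar 4 — State machine (dialog) management: which recognised intent
# moves the conversation from which state to which next state.
STATE_MACHINE = {
    "start": {"buy_dog": "ask_breed", "greet": "start"},
    "ask_breed": {},
}
```

In commercial platforms the first two structures belong to the NLU component and the last two to the dialog management component, which is exactly the split discussed next.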

The two impediments to chatbots becoming true AI agents are intents and state machines.

  • Within a chatbot, the first line of conversation facilitation is intent recognition.
  • And herein lies the challenge: in most chatbot platforms a machine learning model of sorts is used to assign a user utterance to a specific intent.
  • From here the intent is tied to a specific point in the state machine (aka dialog tree). For example, the user input “I am thinking of buying a dog.” is matched to the intent Buy Dog, and from there the intents are hardcoded to dialog entry points.

This strait-laced layer between the NLU ML model and the dialog is intransigent: a conditional if-this-then-that scenario manages the conversation.
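
That if-this-then-that layer can be made concrete with a short sketch. The intent labels and dialog node names below are hypothetical; the point is only that every intent must be anticipated and wired to exactly one entry point.

```python
def route(predicted_intent: str) -> str:
    """Hardcoded routing from the NLU model's intent label to a dialog node."""
    if predicted_intent == "buy_dog":
        return "dialog_node_buy_dog"
    elif predicted_intent == "cancel_order":
        return "dialog_node_cancel_order"
    else:
        # Anything the designer did not anticipate falls through to a
        # generic fallback node — the rigidity described above.
        return "dialog_node_fallback"
```

Adding a new user goal means adding both a new intent and a new branch here; the conversation cannot flex beyond what was wired in.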

The list of intents is also a fixed, hardcoded and defined reference within a chatbot. Any conceivable user input needs to be anticipated and mapped to a single intent.

Again, the list of intents is rigid and fixed. Each intent is subsequently linked to a portion of the pre-defined dialog, as mentioned earlier.

The movement in the marketplace is not to deprecate one or more of these pillars, but to blur the lines between them. We are seeing a merging of sorts, especially with dialog or script management interfaces, with dialog state management components being added to NLU, and more.

General Conversational AI Trends & Observations

There are a few general trends and observations taking shape amongst the leading complete Conversational AI platforms…

1️⃣ Structure is being built into intents in the form of hierarchical, or nested, intents (HumanFirst & Cognigy). Intents can be switched on and off, and weights can be added. Thresholds are set per intent for relevance, and in some cases a threshold can be set for a disambiguation prompt. Sub-patterns can be defined within intents (see Kore.ai). Kore.ai also has sub-intents and follow-up intents.
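
A rough sketch of what such structured intents could look like. The schema below (nesting, enable/disable flags, per-intent thresholds) is an assumption for illustration, not any vendor's actual format.

```python
# Hypothetical nested-intent tree with per-intent settings.
intents = {
    "banking": {
        "enabled": True,
        "threshold": 0.6,
        "children": {
            "banking.transfer": {"enabled": True, "threshold": 0.7},
            "banking.balance": {"enabled": False, "threshold": 0.7},
        },
    },
}

def active_intents(tree):
    """Flatten the hierarchy, keeping only enabled intents and their thresholds."""
    out = []
    for name, node in tree.items():
        if not node.get("enabled", True):
            continue  # switched-off intents are excluded from matching
        out.append((name, node.get("threshold", 0.5)))
        out.extend(active_intents(node.get("children", {})))
    return out
```

Switching `banking.balance` off removes it from matching without deleting its training data, which is the operational appeal of this kind of structure.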

2️⃣ Specific intents are linked to specific sections of the flow, and intents & entities are closely linked, creating a strong association between intents and specific entities.

3️⃣ Intents can be dragged, dropped, nested, un-nested (HumanFirst).

4️⃣ The proliferation of a design-canvas approach to dialog management. Dialog management generally supports digression, disambiguation and multi-modality, with graphic design affordances. There is a focus on voicebot integration and enablement. Currently there are four distinct approaches to dialog state management and development.

5️⃣ The idea of maintaining context across intents and sections of different dialog flows is also being implemented, especially making a FAQ journey relevant and accessible across journeys and intents.
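
One way to picture this: a shared context store that one journey fills and another journey reads. The flows and slot names below are hypothetical illustrations of the idea, not a platform API.

```python
# Shared conversation context, visible to all dialog flows.
context = {}

def run_booking_flow(city: str) -> str:
    """A booking journey that fills a slot into the shared context."""
    context["city"] = city
    return f"Booking a trip to {city}."

def run_faq_flow(question: str) -> str:
    """An FAQ journey that stays relevant by reusing context from other flows."""
    city = context.get("city")
    if "weather" in question.lower() and city:
        return f"Here is the weather for {city}."
    return "Which city do you mean?"
```

Because the FAQ flow can see the `city` slot filled during booking, the user can digress to a question mid-journey without being re-prompted for information they already gave.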

6️⃣ SaaS approach with local private cloud install options.

7️⃣ Entities can be annotated within intents, and compound entities can be defined. Structure is being introduced to entities: in the case of LUIS, machine-learned nested entities; or structure in the sense of regex, lists, relationships, etc. Kore.ai has implemented the idea of user patterns for entities.
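
A small sketch of regex- and list-structured entities. The entity names, pattern and city list are illustrative assumptions, not any platform's built-in definitions.

```python
import re

# Hypothetical entity definitions: one regex-based, one list-based.
ENTITY_DEFS = {
    "order_id": {"type": "regex", "pattern": r"\bORD-\d{5}\b"},
    "city": {"type": "list", "values": ["london", "paris", "berlin"]},
}

def extract_entities(utterance: str) -> dict:
    """Return every defined entity found in the utterance."""
    found = {}
    for name, spec in ENTITY_DEFS.items():
        if spec["type"] == "regex":
            match = re.search(spec["pattern"], utterance)
            if match:
                found[name] = match.group(0)
        elif spec["type"] == "list":
            for value in spec["values"]:
                if value in utterance.lower():
                    found[name] = value
    return found
```
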

8️⃣ Script or chatbot dialog prompt management is improving; this is very helpful in the case of multi-lingual conversational agents.

Nuance Mix has a neat user interface to manage messages within the dialog flow. Barge-in can be set on message level, which is very relevant to IVR and Voicebots. Text can be defined for a text interface or a Voice Assistant. Voice Assistants demand very specific design considerations: due to the ephemeral nature of voice, there are no visual design affordances, hence messages need to be shortened.

9️⃣ Ease of use, with a non-technical end-to-end solution approach.

🔟 Design & development are merging, probably contributing to the demise of a design tool like BotSociety.

1️⃣1️⃣ Focus is lent to improving training time, with solutions being offered in the form of incremental training (Cognigy’s quick build, Rasa’s incremental training) or speeding up training time in general (IBM Watson Assistant).

1️⃣2️⃣ Irrelevance detection is receiving focus from IBM Watson Assistant. It is also an element of the HumanFirst workbench.

1️⃣3️⃣ Negative patterns. These can be identified and eliminated. For instance, a user says “I was trying to Book a Flight when I faced an issue.” Though the machine identifies the intent as Book a Flight, that is not what the user wants to do. In such a case, defining “was trying to” as a negative pattern would ensure that the matched intent is ignored. Kore.ai has this as a standard implementation.
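
The negative-pattern check can be sketched as a post-filter on the matched intent. The pattern list and logic below are an illustration of the concept, not Kore.ai's actual implementation.

```python
# Hypothetical negative patterns: phrases that invalidate an intent match.
NEGATIVE_PATTERNS = ["was trying to", "tried to", "could not"]

def apply_negative_patterns(utterance: str, matched_intent: str):
    """Suppress the matched intent when a negative pattern is present."""
    if any(pattern in utterance.lower() for pattern in NEGATIVE_PATTERNS):
        return None  # ignore the match; the user is reporting, not requesting
    return matched_intent
```
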

1️⃣4️⃣ The idea of Traits is commonplace in Kore.ai. Traits are specific entities, attributes, or details that the users express in their conversations. The utterance may not directly convey any specific intent, but the traits present in the utterance are used in driving the intent detection and bot conversation flows.
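
A minimal sketch of traits steering intent detection. The trait keywords and the trait-to-intent mapping are illustrative assumptions, not Kore.ai's actual configuration.

```python
# Hypothetical traits, each detected via indicative keywords.
TRAIT_KEYWORDS = {
    "frustrated": ["annoyed", "fed up", "not working"],
    "urgent": ["asap", "immediately", "right now"],
}

# Traits can drive the conversation even without a direct intent.
TRAIT_TO_INTENT = {
    "frustrated": "escalate_to_agent",
    "urgent": "priority_support",
}

def detect_traits(utterance: str) -> list:
    """Return every trait whose keywords appear in the utterance."""
    text = utterance.lower()
    return [t for t, kws in TRAIT_KEYWORDS.items() if any(k in text for k in kws)]

def intent_from_traits(utterance: str):
    """When no explicit intent is conveyed, let detected traits pick a flow."""
    for trait in detect_traits(utterance):
        return TRAIT_TO_INTENT[trait]
    return None
```

Here the utterance “This is not working and I am fed up” conveys no explicit intent, yet the frustrated trait routes the conversation to an escalation flow.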

1️⃣5️⃣ Using Cognigy’s Universal Language model one can create intents and example sentences in English, and then speak to the agent in German, Japanese or Russian, and it will still map the input to the correct intent.

Multiple languages are commonplace. Generic, general language support is also an option in some platforms, like Cognigy and Watson Assistant. HumanFirst detects languages on the fly, in a very intuitive manner.

Within IBM Watson Assistant a universal language is available; this universal model adapts to any other language you want to support.


As can be seen from these 15 points, the functionality is growing, and the four pillars are being transformed into a number of vertical vectors of specific functionality.


