
The Evolution Of Intents

Different Frameworks & Different Approaches To Intent Implementation

Introduction

During the last two years there has been a clear evolution with regard to intents.

This focus on intents is justified, considering that intents form a layer through which all conversations need to pass. At the start of any conversation, intents play a crucial role in categorising the conversation according to a user intent.

For good reason, intents have been seen as too rigid: this strait-laced layer demands that each and every user intent be solved for. The approaches followed can be divided into two categories:

  • intent deprecation and
  • intent augmentation.

Intent deprecation has been limited and not widely implemented, and is definitely not a trend among the Gartner leaders. On the contrary, the Gartner leaders are building functionality into intents, transferring complexity and conversation management out of other areas, like dialog state management, into the intents.

Below is a list of examples of how intents are being augmented.

Intent Augmentation

The intent component of the NLU is in essence a list of intents, each represented by a name and a list of utterances defining the intent.
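This structure can be sketched in a few lines. The intent names, example utterances and the token-overlap matcher below are all invented for illustration; production NLUs use trained classifiers, not word overlap.

```python
# Toy illustration: an intent is just a name plus example utterances.
intents = {
    "CheckBalance": [
        "what is my balance",
        "how much money do I have",
        "show my account balance",
    ],
    "TransferFunds": [
        "send money to John",
        "transfer funds to my savings account",
        "move money between accounts",
    ],
}

def match_intent(utterance: str) -> str:
    """Naive matcher: score each intent by best token overlap with its examples."""
    tokens = set(utterance.lower().split())
    scores = {
        name: max(len(tokens & set(example.lower().split())) for example in examples)
        for name, examples in intents.items()
    }
    return max(scores, key=scores.get)

print(match_intent("what's my account balance"))  # CheckBalance
```

However crude, the sketch captures the essential point: the intent layer is a flat list of labelled utterance collections, which is exactly the rigidity the augmentation efforts below try to soften.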

The movement amongst the Gartner leaders is to augment Intents; hence building structure into intents and blurring the line between intents and the other chatbot components.

Some traits I am seeing:

  • Intents are associated with specific entities.
  • Intents are associated with specific sections of the conversational flow.
  • Weights can be set for intents.
  • Intents can be toggled on and off.
  • Quick-replies can be associated with an intent. Single turn dialogs are contained within the intent.
  • Intents can be dragged, dropped, split, merged.
  • Intents are hierarchical, nested, etc.
  • The notion of themes, representing a collection of one or more intents, entities and flows.
  • Data can be uploaded for intent detection, new utterances are automatically associated with existing intents. Clusters are formed for new intents.
  • Thresholds can be set for intents.
  • Confirmation requests can be set within an intent if a threshold is not met.
  • Intents are leveraged for auto disambiguation menus.
  • Structures exist for importing utterances and testing intents against what users are saying.

(Systems examined include: Cognigy, Kore AI, IBM Watson Assistant, Oracle Digital Assistant, Nuance Mix, AWS Lex V2, Rasa, HumanFirst)
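Several of these traits — thresholds, confirmation requests and auto-disambiguation menus — amount to a routing policy over ranked intent scores. A minimal sketch follows; the threshold and margin values are invented for illustration and are not taken from any of the frameworks listed above.

```python
def route(scores: dict, threshold: float = 0.7, margin: float = 0.1) -> dict:
    """Decide what to do with ranked intent scores.

    - Top score meets the threshold with a clear winner: proceed.
    - Top score meets the threshold but runners-up are close: disambiguate.
    - Otherwise: ask the user to confirm the best guess.
    """
    ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
    (top, top_score), runners = ranked[0], ranked[1:]
    close = [name for name, score in runners if top_score - score <= margin]
    if top_score >= threshold and not close:
        return {"action": "proceed", "intent": top}
    if top_score >= threshold and close:
        return {"action": "disambiguate", "options": [top] + close}
    return {"action": "confirm", "intent": top}

print(route({"BookFlight": 0.92, "CancelFlight": 0.31}))
# {'action': 'proceed', 'intent': 'BookFlight'}
```

The three branches map directly onto the traits listed: per-intent thresholds, confirmation requests when a threshold is not met, and auto-generated disambiguation menus when several intents score closely.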

Usually a digital assistant is composed of one or more skills. These skills are in turn composed of a dialog state management system and an NLU. Dialog state management manages the dialog state based on conditions, and also holds the bot responses.

Intents and entities constitute the NLU. Intents can be seen as the verbs and entities as the nouns…
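The verb/noun analogy can be made concrete: an NLU result pairs an intent (the action) with entities (the things acted on). A toy sketch, with invented intent names and a hard-coded entity pattern standing in for a trained extractor:

```python
import re

# Invented destination pattern; real NLUs use trained entity extractors.
CITY = re.compile(r"\bto (London|Paris|Tokyo)\b", re.IGNORECASE)

def parse(utterance: str) -> dict:
    """Return the 'verb' (intent) and 'nouns' (entities) of an utterance."""
    intent = "BookFlight" if "flight" in utterance.lower() else "Unknown"
    match = CITY.search(utterance)
    entities = {"destination": match.group(1)} if match else {}
    return {"intent": intent, "entities": entities}

print(parse("Book a flight to Paris"))
# {'intent': 'BookFlight', 'entities': {'destination': 'Paris'}}
```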


Oracle Digital Assistant: Intents With Embedded Answers

  1. This is a contained approach within Oracle Digital Assistant, where the Q&A is encapsulated within the intent: a quick reply intent.
  2. The user utterance is assigned to an intent, and the response is embedded within the intent, acting as a quick reply.
  3. This can be seen as an example where the lines between intents, dialog flow and bot messages are blurred.
  4. The overhead of segmenting functionality between intents, dialog flow sections and bot messages is negated.
  5. Below is an example of an intent called HoursAsk, with a name and description defined. Within the HoursAsk intent, an answer can also be defined, which is presented to the user whenever this intent is hit.
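The shape of a quick reply intent can be sketched as follows; the answer lives inside the intent definition, so no dialog-flow routing is needed. This is an illustrative structure only, not Oracle Digital Assistant's actual configuration format, and the fallback text is invented.

```python
# Illustrative quick-reply intent: the answer is embedded in the intent itself.
quick_reply_intents = {
    "HoursAsk": {
        "description": "User asks for opening hours",
        "utterances": ["what are your hours", "when are you open"],
        "answer": "We are open Monday to Friday, 09:00 to 17:00.",
    },
}

def respond(intent_name: str) -> str:
    """If the resolved intent carries an embedded answer, return it directly."""
    intent = quick_reply_intents.get(intent_name)
    return intent["answer"] if intent else "Let me route you to an agent."

print(respond("HoursAsk"))
```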

Nuance Mix

  • The switching of intents on and off is quite reminiscent of the Cognigy approach.
  • Weights can be added to certain intents, intents can be switched on and off, and markers show if the intent is assigned and if annotation is pending.
  • Intents can be associated with entities; this tight feedback loop between an intent and one or more entities is an important contributor to accuracy.
  • A dialog can be tested in a test pane, and the conversation is mapped out in realtime in the dialog canvas. This is also the case with Cognigy.
  • While building the prototype, my workspace was split across three tabs: NLU, Dialog development and Project settings. This makes for a convenient extended workspace on multiple screens.
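Weighting and toggling can be sketched as a post-processing step on raw classifier scores. The registry below, its intent names, weights and enabled flags are all invented for illustration and do not reflect Nuance Mix internals.

```python
# Illustrative registry: each intent carries an enabled flag and a weight.
registry = {
    "SmallTalk":  {"enabled": False, "weight": 1.0},
    "CheckOrder": {"enabled": True,  "weight": 1.2},
    "Complaint":  {"enabled": True,  "weight": 0.9},
}

def rerank(raw_scores: dict) -> dict:
    """Drop disabled intents and scale the rest by their weight."""
    return {
        name: round(score * registry[name]["weight"], 2)
        for name, score in raw_scores.items()
        if registry[name]["enabled"]
    }

print(rerank({"SmallTalk": 0.9, "CheckOrder": 0.5, "Complaint": 0.6}))
# {'CheckOrder': 0.6, 'Complaint': 0.54}
```

Toggling an intent off removes it from consideration entirely, even if the raw classifier score is high; weighting merely tilts the ranking.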

IBM Watson Assistant

  • A list of utterances can be extracted from the support centre chat transcripts or other related, recorded customer conversations within the organisation.
  • You will have to decide on a single recommendation source type.
  • The recommendation source data that you add is used to derive both intent and intent example recommendations.
  • This is an important feature as the data is grouped into intents.
  • And within each of these defined intents, a list is made by Watson Assistant which constitutes the user examples.
  • The names of the intents created can be cryptic. It’s best to rename them to something more intelligible for future reference. Obviously you are at liberty to use these generated values as a guide, and edit and update them as you see fit.
  • Merely looking at the way Watson Assistant organises the data will already spark many insights and ideas on organising the data.
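For intuition only, the grouping of utterances into candidate intents can be approximated with a greedy toy clustering that groups utterances sharing a content word. This is nothing like Watson Assistant's production algorithm; the stopword list and sample utterances are invented.

```python
STOPWORDS = {"i", "my", "the", "a", "to", "is", "what", "please"}

def content_words(utterance: str) -> set:
    """Tokens that carry meaning, with common filler words removed."""
    return set(utterance.lower().split()) - STOPWORDS

def cluster(utterances: list) -> list:
    """Greedily group utterances that share at least one content word."""
    clusters = []
    for utterance in utterances:
        words = content_words(utterance)
        for group in clusters:
            if words & group["words"]:
                group["utterances"].append(utterance)
                group["words"] |= words
                break
        else:
            clusters.append({"words": words, "utterances": [utterance]})
    return clusters

groups = cluster([
    "what is my balance",
    "check my balance please",
    "cancel my order",
    "order cancellation",
])
print([g["utterances"] for g in groups])
```

Each resulting group is a candidate intent with its user examples, which is the shape of the recommendations the tooling produces for review and renaming.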

Conclusion

What is next for Language Technology?

  1. Intents, entities and sections of flow will be combined and tightly coupled for sections of the conversation.
  2. Simple conversational implementations will leverage QnA, quick reply intents and intent-less scenarios.
  3. Knowledge bases will become more agile and adaptable.
  4. Different conversational elements will be orchestrated to create a complete conversational experience. This orchestration will be seamless.
  5. Large Language Models (LLMs) will find their way into chatbot technology in the areas of natural language generation, clustering, classification and auto-completion, absorbing portions of the chatbot architecture.
  6. It will not be possible to displace LLMs, and language technology companies will find innovative avenues to leverage them.
  7. Fine-tuning of LLMs will improve considerably in speed and granularity.
  8. Chatbot development frameworks will become settled and opportunity for new entrants will diminish due to high table stakes and fewer opportunities for differentiation.
  9. New vertical areas of growth will emerge which will add considerable value to the current language technology landscape.
  10. The areas of growth will be within the Seven Vertical Vectors. These are:
  • Highly specialised ASR technology.
  • Highly specialised voice and speech synthesis companies.
  • Dialog state design and management; collaborative design, where design and flow authoring merge.
  • Software focussing on personalisation of services. Proactive, highly contextual.
  • Bot testing, quality & QA
  • Data preparation, analysing conversational data. Clustering of data and extracting insights from customer conversations, intent driven development.
  • Conversation and user interaction analysis.

Cobus Greyling

Chief Evangelist @ HumanFirst. I explore and write about all things at the intersection of AI and language; NLP/NLU/LLM, Chat/Voicebots, CCAI. www.humanfirst.ai