Conversational AI: What Makes Cognigy Different?

And How Cognigy Is Democratizing Conversational AI…

Cobus Greyling
9 min read · Feb 3, 2022

Introduction

It is an arduous task writing a review of the Cognigy Conversational AI platform in a single article. These are mere first impressions and general observations.

The Cognigy elements which demand a deep dive are: NLU Connector Editor, Functions, Intent Trainer, Tokens, Digression, NLU Breakout and more.

During the last week, the IDC MarketScape and Gartner released Conversational AI platform assessments. Looking at the segmentation below, Category 2 has always been seen as home to the leading platforms and frameworks, followed by Category 1.

A four category segmentation of the Conversational AI framework landscape.

From the reports, it was evident that frameworks from Category 3 are leading the pack, Cognigy in particular.

Chatbot development platforms and frameworks can be divided into four broad categories, roughly…

Category 1

The open source, more technical NLP tools and chatbot development frameworks. Typically, these tools:

  • Can be installed anywhere
  • Have an open architecture
  • Are open source
  • Offer no or a limited GUI
  • Are configuration-file and pro-code focused
  • Take a machine learning approach
  • Have a higher barrier to entry
  • Scale well with the right skills
  • Demand astute technical planning for installation and operational management
  • Are often used as the underlying technology by Category 3 software
  • Allow new features to be developed and the platform to be enhanced

Above: The languages Cognigy makes provision for. At the bottom a Universal language is visible. This is in keeping with a development amongst the Category 2 frameworks, most notably Microsoft and IBM Watson Assistant.

The Universal language option in Cognigy allows for the inclusion of more than 100 languages by using a multilingual machine learning model approach.

Category 2

  • Often used by large-scale commercial offerings
  • Cloud-based
  • In some instances geographic regions can be selected
  • Seen as safe bets for large organizations
  • Solutions range from pro-code, low-code to no-code
  • Lower barrier to entry
  • GUI focused
  • Little to no insight or control as to what happens under the hood
  • Rigid rule-based dialog state management
  • Cost is most often not negotiable

Category 3

  • These are independent, alternative solution providers for Conversational AI, offering an encapsulated product
  • The technology under the hood is often not made known
  • Frequently built using open-source NLP tools
  • Often innovative approaches to the challenges of Dialog State Design, development and management
  • Low-code to no-code approach
  • The possibility of being acquired
  • Price is often more negotiable
  • Feature requests are more likely to be accommodated
  • Lower barrier to entry and to get going

Access To Cognigy

With an email address and no credit card, you get 30 days access to all of Cognigy.

The offering is 100% free with no commitment, and access to all of the AI features.

Easy onboarding is available with comprehensive documentation and tutorials. There are no downloads, plugins to install or features to unlock.

Some of the channels (aka Mediums) which are available for deploying Virtual Agents.

This includes full-power NLU access, more than 20 channels/mediums, Voice Gateway functionality and more.

The Good

And What Does Cognigy Do Differently?

  • Cognigy is available as public SaaS, private SaaS and self-hosted.
  • The ease of access and open approach of Cognigy. No secrets, no stealth mode mumbo-jumbo.
  • The sheer level of functionality, flexibility and settings…it’s a feature- and functionality-rich environment. In a sense it is overwhelming, and it shows why Cognigy is not hiding their product.
  • Exceptional documentation and tutorials.
  • Flow Nodes. This is a new and innovative approach to dialog state management. Flow nodes combined in a certain way can help create dynamic, flexible, interactive conversations. Nodes range from Basic to Advanced. To some extent this loosens up the state management process. More detail on this later…
  • The execution path is highlighted in green while testing via the conversational test pane. Closing this loop is invaluable. Also, this creates an instant feedback loop between development and testing.
As seen in this simple conversation flow, the execution path is marked green while testing in the dialog test pane.
  • A conversational experience can be created without any NLU component, and NLU providers other than Cognigy can be used. This speaks to large enterprise implementations where components like NLU, Dialog State Management, STT and TTS need to be separated or segmented.
The NLU connector options to connect your Cognigy flow to.
  • Asynchronous functions can be kicked off, which post a notification once completed.
  • Integration with a Cognigy Flow from other systems is also possible. The REST Endpoint lets you connect to a Cognigy Flow directly through a REST interface, while the Webhook Endpoint lets you expose Cognigy Flows via Webhook (a minimal sketch of a REST call follows below this list).
  • Endpoints can be pointed to a specific snapshot. This makes it possible to easily deploy different versions of your Virtual Agent. Snapshots can be imported and exported.
  • Built-in regression and unit testing.
  • A seemingly small touch is assigning a color to a project.
  • Digression is made easy. Jump to a specific node within a flow:
An example where a node is edited to go to a different flow, and to a specific node within that flow. Different Execution Modes are available, and parsing of intents and slots/entities can be toggled on and off.
  • Entities are linked to specific intents:
A slot/entity can be defined by key phrases and synonyms. This entity is subsequently tied to a specific intent.

Below is a practical implementation.

The intent Balances is defined by the example sentences. Within each sentence the slot/entity is contextually defined with the slot name.
  • Nested Intents / Sub Intents
  • Instant testing of your work.
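
To make the REST Endpoint item above more concrete, below is a minimal sketch of how an external system might post a user message to a Cognigy REST Endpoint. The endpoint URL is a placeholder, and the payload fields (userId, sessionId, text, data) are an assumption based on the general pattern described in the Cognigy Endpoint documentation; verify them against your own installation.

```python
import requests

# Placeholder URL: copy the real Endpoint URL from the Cognigy.AI Endpoint editor.
ENDPOINT_URL = "https://endpoint-trial.cognigy.ai/<your-endpoint-token>"

def send_message(user_id: str, session_id: str, text: str) -> dict:
    """Post one user utterance to a Cognigy REST Endpoint and return the reply.

    The payload shape below is an assumption for illustration; confirm the
    exact fields in the Cognigy documentation for your version.
    """
    payload = {
        "userId": user_id,        # stable identifier for the end user
        "sessionId": session_id,  # groups turns into a single conversation
        "text": text,             # the utterance the Flow should process
        "data": {},               # optional structured data passed to the Flow
    }
    response = requests.post(ENDPOINT_URL, json=payload, timeout=10)
    response.raise_for_status()
    return response.json()

if __name__ == "__main__":
    print(send_message("user-001", "session-001", "What is my balance?"))
```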

Some Considerations

  • No Cognigy Speech-To-Text is available. For voicebot implementations, a company will have to source a third-party solution. This is not surprising considering the sheer amount of data required for an STT model, added to which there is regional, accent and language diversity to cater for.
  • Text-To-Speech is also not available from Cognigy. This is not an impediment, and does not detract from the platform.
  • Below is a typical voice gateway configuration, showing where Cognigy fits in:
A screen print from the Cognigy documentation, showing the role of the Cognigy Voice Gateway.
  • When getting familiar with Cognigy, the Cognigy way of doing things needs to be embraced. The development environment needs to be approached with an open mind.
  • As with the Category 2 frameworks, the NLU pipeline is not open or configurable. If this is problematic for your implementation, an alternative NLU solution can be used.

Flow Nodes

There are a few things which Cognigy does differently from other environments.

The principle of Flow Nodes is one of them.

Usually within the dialog state management, a dialog node is a standard conversational element where the properties of this element can be set. This is certainly the case with Microsoft Power Virtual Agents and IBM Watson Assistant Actions.

This is a subtle yet substantial design shift Cognigy has made: instead of generic conversational or dialog nodes with properties that can be set, there is an array of different, purpose-specific flow nodes.

Flow Nodes can be defined by Function or Extensions. Under functions, 68 different types exist.

These 68 Function types are segmented under 6 headings.

There are 8 Cognigy Voice Gateway extensions, while Mail, MongoDB and SQL Server extensions are also available.

Nodes have different functionalities within the conversational process.
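
As a purely conceptual sketch (this is not Cognigy's internal model, and the node names below are hypothetical), the design shift from one generic dialog node with configurable properties to a set of distinct, typed flow nodes could be illustrated as follows:

```python
from dataclasses import dataclass, field

# Conventional approach: a single generic dialog node, with behaviour
# driven entirely by a bag of properties.
@dataclass
class DialogNode:
    properties: dict = field(default_factory=dict)  # e.g. {"type": "message", "text": "Hi"}

# Flow-node style approach: many small, purpose-built node types,
# each exposing only the settings relevant to its job.
@dataclass
class SayNode:
    text: str          # what the virtual agent says

@dataclass
class IfNode:
    condition: str     # expression evaluated against the conversation state

@dataclass
class HttpRequestNode:
    url: str           # backend service to call (hypothetical example)

# A flow is then an ordered collection of typed nodes.
flow = [
    SayNode(text="Hi! How can I help?"),
    IfNode(condition="input.intent == 'Balances'"),
    HttpRequestNode(url="https://api.example.com/balance"),
]
```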

Execution Path

The execution path of the conversation is highlighted within the dialog flow canvas when testing.

This is ideal for tying the design to the user experience…I see huge benefit here in testing different conversation scenarios in a rapid fashion.

The Cognigy flow design & development console with the test pane on the right and the execution path highlighted in green.

Flow execution typically starts at the top with the green Start Node. This is where the NLU takes place and the NLU results are subsequently published to the Input object.

A flow is triggered with each new input and starts at the Start Node unless specified differently.
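
As a rough illustration of that idea, the NLU result published to the Input object might look something like the sketch below. The field names are assumptions made for illustration; the authoritative structure is documented in the Cognigy Input object reference.

```python
# Hypothetical sketch of an NLU result as it might be published to the Input
# object after the Start Node. Field names are assumptions for illustration.
input_object = {
    "text": "What is the balance on my savings account?",  # raw user utterance
    "intent": "Balances",                                   # winning intent
    "intentScore": 0.92,                                    # confidence score
    "slots": {
        "AccountType": ["savings"],                         # mapped keyphrase/slot
    },
}

# Subsequent Flow Nodes can then branch on these values, for example:
if input_object["intent"] == "Balances" and "AccountType" in input_object["slots"]:
    account = input_object["slots"]["AccountType"][0]
    print(f"Fetching the {account} account balance...")
```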

NLU

Cognigy.AI features built-in support for a number of third-party NLU engines. These are:

  • Google Dialogflow
  • IBM Watson Assistant
  • Amazon Alexa
  • Amazon Lex
  • Microsoft LUIS
  • Code (custom)

The intent training module includes a Hierarchy feature that enables intent grouping and layering. This means that parent intents can be created which inherit the example sentences from child intents, providing more flexibility to the agent’s understanding and increasing the chances of the correct intent being identified.
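
To illustrate the inheritance idea only (this sketch is not Cognigy's implementation, and the intent names and example sentences are hypothetical), a parent intent that inherits the example sentences of its children could look like this:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Intent:
    name: str
    examples: List[str] = field(default_factory=list)
    children: List["Intent"] = field(default_factory=list)

    def all_examples(self) -> List[str]:
        """Own example sentences plus those inherited from child intents."""
        inherited = [ex for child in self.children for ex in child.all_examples()]
        return self.examples + inherited

# A parent "Accounts" intent inheriting from two child intents.
balances = Intent("Balances", ["What is my balance?", "How much money do I have?"])
transfers = Intent("Transfers", ["Move money to my savings account"])
accounts = Intent("Accounts", ["Tell me about my accounts"], children=[balances, transfers])

print(accounts.all_examples())
# ['Tell me about my accounts', 'What is my balance?', 'How much money do I have?',
#  'Move money to my savings account']
```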

A sentence is added, and the key word or phrase is selected. The phrase is defined by a Lexicon Slot Name on the right.

Lexicons can be seen as groupings of domain-specific Keyphrases (Entities) that are in turn linked to a Flow.

A Lexicon can be seen as a dictionary that allows the Virtual Agent to “understand” specific words or phrases in context.

As soon as a Keyphrase is detected, it is published to the Input object for further use. This process is called Slot Mapping.

Different key phrases defined under Slots with Synonyms.
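
As a naive, conceptual sketch of Slot Mapping (this is not Cognigy's actual matching algorithm, and the lexicon values are made up), keyphrases and their synonyms could be resolved to slots roughly like this:

```python
# A toy lexicon: slot name -> canonical keyphrases and their synonyms.
LEXICON = {
    "AccountType": {
        "savings": ["savings", "savings account"],
        "cheque": ["cheque", "checking", "current account"],
    },
}

def map_slots(utterance: str, lexicon: dict) -> dict:
    """Return detected slots as {slot_name: [canonical_keyphrase, ...]}.

    A simple substring match for illustration only; real slot mapping in an
    NLU engine is considerably more sophisticated.
    """
    text = utterance.lower()
    slots: dict = {}
    for slot_name, keyphrases in lexicon.items():
        for canonical, synonyms in keyphrases.items():
            if any(syn in text for syn in synonyms):
                slots.setdefault(slot_name, []).append(canonical)
    return slots

print(map_slots("What is the balance on my checking account?", LEXICON))
# {'AccountType': ['cheque']}
```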

Cognigy NLU takes some getting used to, with Entities being referred to as Slots and defined in the Lexicon editor. The Slots can in turn be linked to intents. Intents can have sub or nested intents.

Compound slots can be defined per intent example, as seen above.

Conclusion

It is difficult to single out Cognigy’s strongest points.

It seems to be a very complete and comprehensive solution: feature rich, with more than adequate tooling in every department.

Another positive of Cognigy is that it is certainly not a clone of other frameworks. A hallmark of this framework is authenticity and a fresh approach. Many concepts are novel and address known vulnerabilities in the current marketplace.

Cognigy takes a measured approach to loosening the fixed state problem of dialog state management with flow nodes. Innovation is evident in intents, entities, conversation flow, deploying, tweaking and testing.

There is also an open architecture approach for those organizations which only require a certain component from Cognigy.

When it comes to selecting a Conversational AI framework, the solution itself obviously plays a crucial role, but so do factors like the team’s technical prowess, collaboration needs and release cadences. Installation, operational elements and support are also determining factors.

Hence the decision on the best framework will most definitely be influenced by the implementation scenario.
