Cognigy’s Implementation Of LLMs Turns The Pressure Up For Other Conversational AI Vendors

Their support for generative AI-powered workflows goes beyond anything seen yet on the market.

Cobus Greyling
6 min read · Jan 27, 2023


A few days ago I wrote an article on how the LLM-powered features implemented by conversational AI vendors have so far converged on the more “obvious” and quick-to-implement use-cases (like training-example generation and intent fallbacks).

Well, that was before Cognigy demonstrated their LLM implementations.

Now, I would strongly argue that Cognigy is leading the way in LLM implementation for Conversational AI Frameworks.

I get the sense other frameworks started from a list of LLM use-cases and considered how those could be added to their framework, whereas with Cognigy, experience was front of mind:

  1. Developer Experience
  2. End-user Experience
  3. Agent Experience

The 3️⃣ hallmarks I see in Cognigy’s approach are:

  1. The Cognigy framework absorbs LLM functionality seamlessly, rather than taking the bolt-on approach of most other frameworks.
  2. Cognigy is exploring how AI can assist humans, with specific focus on bot developers and customer representatives.
  3. Cognigy is not only leveraging Generative AI, but also Predictive AI, especially in the case of advanced entity detection.

In my opinion, the three key questions the Cognigy team answered are:

✪ How to create an AI-assisted bot-building experience that works?

✪ How can the user experience be improved at both the bot and agent level?

✪ How can voicebots become mainstream and loved by users?

⭐️ Please follow me on LinkedIn for updates on Conversational AI ⭐️

Here Are The Cognigy Use-Cases:

I have to state that the demos did not seem to be pre-recorded, but live.

What made it all the more impressive was that the functionality is multi-lingual and seamlessly embedded into the current UI.

Automatic Flow Generation

This is by far the most innovative implementation of generative AI that I’ve seen so far. With Cognigy you will be able to enter a name for your flow together with a short description of what is required.

What I find impressive is the creativity of the generated design: input variations are chosen based on the data type, for instance quick replies. Variables, code nodes and API integration points are defined, guiding the developer on the ambit of the flow.

A flow can also be generated from an example conversation/dialog which can be uploaded.
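Conceptually, this kind of generation can be sketched as a single well-structured prompt to an LLM. The helper below is purely illustrative — the prompt wording and parameters are my assumptions, not Cognigy’s actual implementation:

```python
def build_flow_prompt(name: str, description: str, transcript: str = "") -> str:
    """Assemble a hypothetical LLM prompt for generating a dialog flow."""
    prompt = (
        f"Generate a conversational flow named '{name}'.\n"
        f"Requirement: {description}\n"
        "Return a list of nodes (questions, quick replies, code nodes, "
        "API placeholders) with variables for captured values."
    )
    if transcript:
        # A flow can also be derived from an uploaded example conversation.
        prompt += f"\nBase the flow on this example dialog:\n{transcript}"
    return prompt

print(build_flow_prompt("Order Pizza", "Take a pizza order with size and toppings"))
```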

Even a more technical call flow was shown in the demo: from a short description of the requirement, a Base64 conversion bot was auto-generated, complete with JavaScript code and more.
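To give a sense of what such a generated code node boils down to, here is the equivalent Base64 conversion logic. The demo’s bot emitted JavaScript; this is simply the same logic shown in Python:

```python
import base64

def to_base64(text: str) -> str:
    """Encode a UTF-8 string as Base64, as a conversion bot's code node might."""
    return base64.b64encode(text.encode("utf-8")).decode("ascii")

def from_base64(encoded: str) -> str:
    """Decode a Base64 string back to UTF-8 text."""
    return base64.b64decode(encoded).decode("utf-8")

print(to_base64("hello"))        # aGVsbG8=
print(from_base64("aGVsbG8="))   # hello
```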

Flow Co-Pilot

This functionality was not demonstrated, as it is still in the works. The honesty and transparency of Cognigy was refreshing: they could have presented a very impressive demo of Co-Pilot, but instead levelled with the audience by saying it is not ready yet.

Flow Co-Pilot is like auto-complete for flow building, taking the current flow structure into account. It works with logic nodes but, more importantly, can generate code nodes and API placeholders for users.

Intent Training Examples

This feature is offered by most Conversational AI Frameworks (CAIFs) that have implemented LLMs: training phrases are generated for each intent. Other frameworks do it by leveraging the intent name or a single training example.

The approach from Cognigy is different and very innovative.

Adding a short intent description gives the generative AI more context to leverage when generating examples.
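The difference is easy to show in prompt terms: a name-only prompt gives the model very little to work with, while the description supplies the missing context. The prompt wording below is my own sketch, not Cognigy’s:

```python
def intent_examples_prompt(intent_name: str, description: str = "", n: int = 10) -> str:
    """Build a hypothetical prompt for generating intent training phrases."""
    prompt = f"Generate {n} varied user utterances for the intent '{intent_name}'."
    if description:
        # The short description is the extra context Cognigy asks for.
        prompt += f" The intent covers: {description}."
    return prompt

print(intent_examples_prompt("cancel_order", "a customer wants to cancel a pending order"))
```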

Lexicons

In the Cognigy world, Lexicons are collections of domain-specific key phrases (also known as Entities) that can be attached to a Flow.

This is a slightly more complex scenario than intent examples, but it was demonstrated that with a lexicon name, a short description, and the number of examples required, a list of lexicon entries can be created, complete with key phrases, slots and synonyms.
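A generated lexicon entry can be pictured as a small record combining a key phrase, the slot it fills, and its synonyms. The shape below is my own illustration of the concept, not Cognigy’s actual schema:

```python
from dataclasses import dataclass, field

@dataclass
class LexiconEntry:
    key_phrase: str                # the domain-specific phrase to detect
    slot: str                      # the slot (entity type) it fills
    synonyms: list = field(default_factory=list)

# Entries an LLM might generate for a hypothetical "Coffee Drinks" lexicon.
entries = [
    LexiconEntry("flat white", "drink", ["flatwhite", "flat-white"]),
    LexiconEntry("espresso", "drink", ["short black"]),
]
print(len(entries), entries[0].slot)
```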


Considering the image below, Cognigy is also leveraging LLMs for enhanced entity extraction. Again LLMs are playing a supporting role by enhancing the existing NLU pipeline.


AI Enhanced Bot Messaging

AI enhanced output allows for generative output to be adjusted to the context of the conversation, as the image below clearly illustrates.

The level of creativity can be set together with some contextual settings which can further enhance the user experience.
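The “level of creativity” maps naturally onto an LLM sampling parameter such as temperature. The helper below sketches that mapping; the 0–10 scale and payload field names are my assumptions, not Cognigy’s actual settings:

```python
def build_generation_config(creativity: int, context: dict) -> dict:
    """Map a 0-10 creativity dial onto a hypothetical LLM request payload."""
    if not 0 <= creativity <= 10:
        raise ValueError("creativity must be between 0 and 10")
    return {
        "temperature": creativity / 10.0,  # 0 = deterministic, 1 = most creative
        "context": context,                # conversation facts to ground the reply
    }

cfg = build_generation_config(7, {"channel": "voice", "customer_name": "Ana"})
print(cfg["temperature"])  # 0.7
```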

Phone-based Voicebot

The demonstration of the phone-based voicebot was on a level I have only seen with Voca AI, which was recently acquired by Snap. The caller could self-correct, digress, rephrase, etc., with the voicebot matching the user at every step. The resilience of the voice interface is astounding and bodes well for Cognigy to be a leader in voicebot orchestration.


Below is a screenshot of the agent desktop; this is a good example of AI assisting the agent, with the bot listening in on the call while:

  • transcribing the user’s speech into text and displaying it,
  • generating a suggested response script with related information for the agent,
  • blending these AI features seamlessly with the other agent desktop elements.
