Large Language Models (LLMs) Will Not Replace Traditional Chatbot NLU…For Now

Traditional NLU pipelines are well optimised and excel at extremely granular fine-tuning of intents and entities, at no significant cost and with rapid iteration.

Cobus Greyling
5 min read · Feb 8, 2023


Introduction

LLMs are currently making inroads into existing Conversational AI Frameworks (CAIFs).

The generative and predictive power of LLMs is undeniable, and virtually all Conversational AI Frameworks have introduced some form of LLM…or detailed their future intentions to do so.

These inroads are primarily on a generative front and are not focused on predictive functionality.

The primary use of NLU within chatbot frameworks is predictive in nature: user utterances are assigned to predefined classes and intents, and entities are extracted, ranging from named entities to industry- or vertical-specific fine-tuned entities.

Below are a few practical considerations on the existing and future importance of NLU, keeping in mind that NLU can also be used in isolation for offline processing of conversational data.

✅ Efficient Open-Sourced NLU Pipelines

Most CAIFs have generic, internal NLU pipelines which are most probably based on open-source software with no licensing requirements or third-party commitments.

For example, with Rasa a powerful open-source NLU API can be created which also supports structured intents and different entity types. The standard-install NLU pipeline can be used as-is, while remaining highly configurable.

There is no need for large amounts of training data or computing power, training time is fairly fast, and there are a number of lightweight local install options.
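As a minimal sketch of what such a pipeline exposes, the snippet below assumes a Rasa model has already been trained and is served locally with `rasa run --enable-api` on the default port; it posts a hypothetical utterance to the `/model/parse` endpoint and prints the predicted intent and entities.

```python
import requests

# Assumes a Rasa model is already trained and served locally with:
#   rasa run --enable-api
RASA_PARSE_URL = "http://localhost:5005/model/parse"

def parse_utterance(text: str) -> dict:
    """Send an utterance to the local Rasa NLU pipeline and return the parse result."""
    response = requests.post(RASA_PARSE_URL, json={"text": text}, timeout=10)
    response.raise_for_status()
    return response.json()

if __name__ == "__main__":
    result = parse_utterance("I want to move my appointment to Friday")
    print(result["intent"]["name"], result["intent"]["confidence"])
    for entity in result.get("entities", []):
        print(entity["entity"], "->", entity["value"])
```

The same endpoint can just as easily be called in isolation for offline processing of logged conversational data, as mentioned earlier.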

For minority human languages, you can read more about Rasa’s BytePairFeaturizer here.

⭐️ Please follow me on LinkedIn for updates on Conversational AI ⭐️

✅ Built-In Efficiencies For Intents & Entities

Over time, significant efficiency and structure have been built into intents and entities.

  • Nested intents or sub-intents are implemented by the leading CAIFs in Gartner’s rankings.
  • Intents are flexible and in some cases they can be split or merged via a drag-and-drop UI.
  • Entities are associated only with certain intents. The coupling between an intent and certain entities means two checks are performed before the chatbot responds (see the sketch after this list).
  • Structure in entities includes Compound Contextual Entities, Entity Decomposition, entity groups, roles, etc.
  • Accurate entity detection is crucial to fulfilling an order and not having to re-prompt a user for data they have already entered.
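To make the intent and entity coupling concrete, here is a minimal sketch of the two checks mentioned above. The intent names, required entity slots and confidence threshold are hypothetical placeholders; real frameworks express this coupling in their own configuration, but the logic is the same: confirm the intent prediction is confident enough, then work out which required entities still need to be prompted for.

```python
# Hypothetical intent/entity schema; real frameworks express this in their own config.
REQUIRED_ENTITIES = {
    "book_flight": {"origin", "destination", "travel_date"},
    "check_balance": {"account_type"},
}
CONFIDENCE_THRESHOLD = 0.7  # illustrative value

def missing_entities(parse_result: dict) -> set[str] | None:
    """Return the entity slots still needed, or None if the intent itself is too uncertain."""
    intent = parse_result["intent"]["name"]
    confidence = parse_result["intent"]["confidence"]

    # Check 1: is the intent prediction confident enough to act on?
    if confidence < CONFIDENCE_THRESHOLD:
        return None  # fall back to a clarification prompt

    # Check 2: which required entities were not detected in the utterance?
    found = {entity["entity"] for entity in parse_result.get("entities", [])}
    return REQUIRED_ENTITIES.get(intent, set()) - found
```

Only the entities that are still missing need to be re-prompted, so the user is never asked again for data they have already supplied.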

✅ Training Time & No-Code Environments

With LLMs, data formatting and transformation can be tedious and tricky, and are typically performed in a pro-code environment.
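As a small illustration of that formatting overhead, the sketch below turns a handful of labelled utterances into the prompt/completion JSONL layout that OpenAI’s legacy fine-tuning endpoint expected at the time of writing. The separator and leading-space conventions are assumptions based on the documented guidance of the time, not a definitive recipe.

```python
import json

# Hypothetical labelled utterances; in practice these come from the NLU training set.
examples = [
    ("I want to move my appointment to Friday", "reschedule_appointment"),
    ("What is my current account balance?", "check_balance"),
]

# Legacy prompt/completion fine-tuning layout; the separator and leading space
# follow conventions documented at the time and are included here as assumptions.
with open("intent_finetune.jsonl", "w", encoding="utf-8") as f:
    for utterance, intent in examples:
        record = {"prompt": utterance + "\n\n###\n\n", "completion": " " + intent}
        f.write(json.dumps(record) + "\n")
```

Even this toy example shows the kind of pro-code transformation step that a no-code NLU studio hides from the user.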

NLU relies on only a few training examples, and generally the NLU management console is a no-code studio environment.

Frameworks like Rasa and Cognigy have introduced incremental training, while IBM Watson Assistant has drastically shortened its NLU training time recently.

✅ Comparable Results between LLMs & NLU

When performing NLU-like tasks with an LLM, the results yielded by NLU engines and LLMs are in many cases comparable, with the NLU results often being more consistent and predictable.

But the above-mentioned scenario is one where the LLM is not being used according to its strengths, while NLU is optimised for creating a classification model from a small amount of data.
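To make the comparison concrete, here is a minimal sketch of zero-shot intent classification via an LLM prompt, using the legacy openai.Completion interface that was current at the time of writing. The intent labels, model name and prompt wording are illustrative assumptions, not a recommended setup.

```python
import openai

openai.api_key = "YOUR_API_KEY"  # assumes an OpenAI account

# Hypothetical intent labels; a trained NLU model would hold these in its classifier.
INTENTS = ["book_flight", "cancel_booking", "check_balance"]

def classify_with_llm(utterance: str) -> str:
    """Zero-shot intent classification by prompting, rather than a trained NLU classifier."""
    prompt = (
        "Classify the user message into one of these intents: "
        + ", ".join(INTENTS)
        + f"\nMessage: {utterance}\nIntent:"
    )
    response = openai.Completion.create(
        model="text-davinci-003",
        prompt=prompt,
        max_tokens=5,
        temperature=0,  # reduce run-to-run variation
    )
    return response["choices"][0]["text"].strip()

print(classify_with_llm("Please cancel my flight to Lisbon"))
```

With the temperature set to 0 the output is more repeatable, but as noted below, run-to-run consistency is still an area where a trained NLU classifier tends to be ahead.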

✅ Consistency with NLU

NLU yields consistent and predictable results, with little to no variation when the same data is submitted.

When testing different LLM providers from a zero- to few-shot learning perspective, OpenAI does seem to yield the best results, followed by AI21 and Cohere.

In most instances it has been hard to generate consistent and accurate content with LLMs like Goose AI and Bloom.

In Conclusion

LLMs and NLU should be seen as two different technologies. And as I said in the title…for now.

However, I predict this will change over time, and LLMs will take over more of the NLU territory…

One case in point is Cohere’s new no-code Dashboard, where, among other features, data can be uploaded and intents trained on their LLM. This no-code environment is starting to strongly resemble typical NLU no-code interfaces.

⭐️ Please follow me on LinkedIn for updates on Conversational AI ⭐️

I’m currently the Chief Evangelist @ HumanFirst. I explore and write about all things at the intersection of AI and language; ranging from LLMs, Chatbots, Voicebots, Development Frameworks, Data-Centric latent spaces and more.

https://www.linkedin.com/in/cobusgreyling

