
Analysis Of The Gartner Hype Cycle for AI

With Specific Focus On Language Technology Considerations

Cobus Greyling
7 min read · Jul 3, 2022



It has been close to 10 months since the Gartner Hype Cycle for AI was released, and there are a few considerations I would like to address with regard to Conversational AI.

Considering the Gartner AI Hype Cycle, there are arguably five technologies related to Language Technology and Conversational AI, constituting 15% of the report.

These five language related technologies in the hype cycle are:

  1. Semantic Search (2–5 Years from Plateau of Productivity)
  2. Chatbots (< 2 Years)
  3. NLP (5–10 Years)
  4. AI Orchestration (2–5 Years)
  5. Generative AI (2–5 Years)

The order of the five technologies above represents their progress along the hype cycle.

The Gartner Hype Cycle for Artificial Intelligence, September 2021. Source

Above is the Gartner Hype Cycle for AI, with five red arrows (added by me) showing the position of each of the Conversational AI technologies.

Obviously some of these technologies will move through the hype cycle faster than others, based on their current position and estimated time to maturity. Hence Gartner sees significant acceleration with several of these technologies.

For instance, Generative AI is at the beginning of the cycle, yet only 2 to 5 years away from the plateau. NLP is in the trough of disillusionment, three phases further along, but still 5 to 10 years away from the plateau.

AI Orchestration, meanwhile, is 2 to 5 years from the plateau, but is still stuck in the innovation trigger phase.

Gartner expects that by 2025, 70% of organisations will have operationalised AI architectures due to the rapid maturity of AI orchestration initiatives.

~ Gartner

Conversational AI & The Four Prevailing Trends

As seen in the diagram below, there are four prevailing trends underpinning the AI hype cycle: Responsible AI; Operationalising AI; Efficiency in Data, Models & Compute; and Data for AI. These four prevailing trends were identified and highlighted by Gartner.

There is significant focus on Responsible AI from the Large Language Model providers like Co:here, OpenAI and AI21labs. This is evident in their documentation and in the conditions they place on production implementations and on the nature of an application prior to launch.

Considering Operationalising AI

From a conversational perspective, the approach to operationalising AI is well defined, and use-cases in customer support and customer care are obvious avenues to operationalising AI.

This is accentuated by the relentless focus on voicebots and automating calls to the call centre. The focus on voice is clearly visible with companies like Boost AI, Cognigy, Kore AI, Nuance Mix, IBM Watson Assistant and others.

Turning To Efficiency In Data

Gartner expects that by 2025, 70% of organisations will be compelled to shift their focus from big data to “small and wide data”, providing more context for analytics and making AI less data-hungry.

There is a general perception that large amounts of data need to be collected in order to get AI projects production ready. From a language perspective, this has been debunked: the Co:here & HumanFirst POC illustrated how a large language model (LLM) can be leveraged with a relatively small amount of training data.

I have also created a prototype where a chatbot is bootstrapped using the OpenAI Large Language Model. This utilised unsupervised learning with zero-shot and few-shot training.
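The few-shot idea can be sketched in a few lines: a handful of labelled utterances are packed into a prompt, and an LLM completion endpoint (not called here) would then label a new utterance. The example utterances, intent names and prompt layout below are illustrative assumptions, not the exact prompts used in the prototype.

```python
# Hypothetical few-shot examples for intent classification.
FEW_SHOT_EXAMPLES = [
    ("I forgot my password", "reset_password"),
    ("Where is my parcel?", "track_order"),
    ("I want to close my account", "cancel_account"),
]

def build_intent_prompt(utterance: str) -> str:
    """Assemble a few-shot classification prompt for an LLM to complete."""
    lines = ["Classify the user utterance into an intent.", ""]
    for text, intent in FEW_SHOT_EXAMPLES:
        lines.append(f"Utterance: {text}")
        lines.append(f"Intent: {intent}")
        lines.append("")
    # The model is asked to complete the final "Intent:" line.
    lines.append(f"Utterance: {utterance}")
    lines.append("Intent:")
    return "\n".join(lines)

prompt = build_intent_prompt("Has my order shipped yet?")
print(prompt)
```

In practice the returned completion (for this prompt, ideally `track_order`) would be mapped back onto the chatbot's dialog logic; the point is that three examples per intent can stand in for thousands of labelled training records.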

Too frequently chatbot projects are underfunded as a form of mitigation, to the detriment of the project.

~ Robb Wilson (Founder of OneReach AI)

Conversational AI

Semantic Search

Considering the five Conversational AI technologies which are part of the Garner Hype Cycle, Semantic Search is the most advanced in the cycle.

Semantic search finds itself at the inception of the slope of enlightenment.

There are a few ways semantic search is surfacing in customer facing solutions.

Search has become an avenue for rapid bootstrapping of a chatbot implementation.

Oracle Digital Assistant has a few features in their framework leveraging search.

Watson Discovery is an easy way to ingest data and search via natural language.

And Pinecone is banking on democratising semantic search with its vector database, acting as a connection between a data source and a user interface, where the user executes searches in natural language.
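The retrieval step behind this kind of semantic search can be illustrated with a toy sketch: documents and the query are represented as embedding vectors and ranked by similarity. Production systems (Pinecone included) use learned, high-dimensional embeddings and approximate-nearest-neighbour indexes; the tiny hand-made vectors below are purely illustrative assumptions.

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm

# Pretend embeddings for three FAQ entries (dimension 3 for brevity).
docs = {
    "How do I reset my password?": [0.9, 0.1, 0.0],
    "What are your opening hours?": [0.0, 0.8, 0.2],
    "How do I track my delivery?": [0.1, 0.1, 0.9],
}

# Pretend embedding of the natural-language query "I can't log in".
query_vec = [0.85, 0.15, 0.05]

# Rank documents by similarity to the query and take the best match.
best = max(docs, key=lambda d: cosine(query_vec, docs[d]))
print(best)  # the password-reset FAQ wins despite no shared keywords
```

Note that the query shares no keywords with the winning document; the match happens in embedding space, which is exactly what separates semantic search from keyword search.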

A significant component of semantic search is the clustering of user utterances. A number of chatbot frameworks have ventured into clustering semantically similar utterances, questions or facts, and Large Language Model providers like Co:here have significant prowess in clustering user utterances. Read more about these approaches to search here.

For clustering semantically similar utterances or data in an unsupervised fashion for intent detection and general data exploration, HumanFirst is at the forefront.
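A minimal sketch of what such unsupervised intent discovery does: greedily group utterances whose embedding similarity to a cluster's first member exceeds a threshold. The two-dimensional vectors are hand-made stand-ins for real sentence embeddings, and the greedy algorithm is a deliberately simple assumption; production tools use far more robust clustering.

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Utterances paired with pretend sentence embeddings.
utterances = [
    ("reset my password please", [0.9, 0.1]),
    ("I lost my login details",  [0.8, 0.2]),
    ("where is my parcel",       [0.1, 0.9]),
    ("track my delivery",        [0.2, 0.8]),
]

def cluster(items, threshold=0.9):
    """Greedy clustering: join the first cluster whose seed is similar enough."""
    clusters = []  # each cluster is a list of (text, vector) pairs
    for text, vec in items:
        for c in clusters:
            if cosine(vec, c[0][1]) >= threshold:
                c.append((text, vec))
                break
        else:
            clusters.append([(text, vec)])
    return clusters

groups = cluster(utterances)
for g in groups:
    print([text for text, _ in g])
```

Run over real conversation logs, each resulting group is a candidate intent, which a designer can then name and refine; this is the data-exploration workflow the paragraph above describes.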


Chatbots

Chatbots are currently in the trough of disillusionment, and less than two years away from the plateau. How will the technology catapult itself into the plateau of productivity within the next 16 months? This seems highly unlikely.

There are a few strategies emerging to fast-track chatbots and make them vital to customer care operations. Fast-tracking includes bootstrapping a chatbot with search or QnA. Another approach to bootstrapping a chatbot is Large Language Models.

Another route to relevance for chatbots is automating customer care calls via a voicebot.

The route for chatbots out of this trough of disillusionment will most probably be by leveraging some of the existing Conversational AI technologies…

Chatbot development frameworks include other AI technologies in the hype cycle. In most cases chatbot frameworks are flexible, and other technologies can be included to support the framework.

For instance, a chatbot solution can include semantic search, an NLP layer, AI Orchestration in the case of OneReach AI, and Generative AI in the case of Co:here, OpenAI and the like.


NLP

According to Gartner, Natural Language Processing is five to ten years away from the plateau of productivity. Being this far out makes NLP a real aberration in time-to-productivity compared to the other language technologies.

I need to ask myself: what is the definition of NLP, and what does it encompass? As a supporting technology, NLP can play a significant role in chatbot frameworks, but are LLMs superseding NLP to some degree?

I can imagine there are challenges in implementing NLP successfully and effectively as part of business processes, and in realising true business value from passing emails, voice calls and other internal and external communications through an NLP layer.

AI Orchestration

AI Orchestration finds itself the furthest back in the hype cycle.

Orchestration is an aspect in which a framework like OneReach AI specialises. After looking at their environment, I can see how it can be used as an orchestration layer for AI workflows.

Generative AI

Generative AI is second last in the hype cycle. The only generative systems I have prototyped with were Co:here, OpenAI and AI21labs. Generative AI is useful not only for generating bot responses, but also for generating a chatbot from few-shot training which can maintain state and context.

Generative AI can also be implemented to summarise text, extract keywords, and so on. Although Generative AI is at the inception of the hype cycle, certain elements can already be implemented selectively.
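These ancillary tasks mostly come down to pointing the same completion endpoint at different instructions. The sketch below shows only the (assumed) prompt construction for summarisation and keyword extraction; the actual API call is omitted, and the wording of the instructions is my own, not any provider's recommended prompt.

```python
def summarise_prompt(text: str) -> str:
    """Build an instruction prompt asking a generative model to summarise."""
    return f"Summarise the following text in one sentence:\n\n{text}\n\nSummary:"

def keywords_prompt(text: str) -> str:
    """Build an instruction prompt asking a generative model for keywords."""
    return f"Extract the five most important keywords from the text:\n\n{text}\n\nKeywords:"

passage = "The Gartner Hype Cycle places Generative AI near the innovation trigger."
print(summarise_prompt(passage))
print(keywords_prompt(passage))
```

The same pattern extends to rewriting, translation and paraphrasing, which is why a single generative model can serve several roles inside one chatbot solution.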

Organisations can apply generative AI that creates original media content, synthetic data and models of physical objects.

~ Gartner


It is encouraging that 15% of the AI technologies on the Gartner AI Hype Cycle are Conversational AI and Language Technology related.

And many of the other AI disciplines can play a supporting role in Conversational AI.



Cobus Greyling

I explore and write about all things at the intersection of AI & language; LLMs/NLP/NLU, Chat/Voicebots, CCAI.