Practical Chatbot Use Cases For GPT-3

Here Are Seven Functional Examples…

Cobus Greyling

--

Introduction

In this article I list seven practical and readily achievable implementations of GPT-3 within an existing chatbot.

There is much debate around GPT-3, and the discussion is often very polarized.

Is GPT-3 a chatbot development framework? No. Does it allow you to define an NLU pipeline? No. Does it allow for enterprise-grade fine-tuning in terms of intents, entities, forms, dialog state management and NLG? No.

But GPT-3 does have quite a few auxiliary implementation use-cases for existing chatbots. Also, GPT-3 is pushing the boundaries on intent deprecation, Natural Language Generation (NLG), low-code implementations and deprecation of the dialog state machine.

That said, GPT-3 remains very experimental, and extremely fun to work with.

1. Grammar Correction

There are existing grammar and spelling correction APIs, but often these mechanisms do not suffice.

The grammar of a sentence is corrected.

In cases where an intent and entities cannot be detected, the user utterance can be run through the grammar correction prompt. As you can see from the examples above, the sentences provided are corrected to a large degree.
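
A minimal sketch of what such a call could look like, assuming the `openai` Python package and the davinci completion engine as available at the time of writing; the utterance, prompt wording and parameter values are illustrative, not prescriptive.

```python
import openai

openai.api_key = "YOUR_API_KEY"  # assumption: supplied from your own configuration

# Hypothetical utterance for which no intent or entities could be detected
utterance = "she no went to the market yesterday"

response = openai.Completion.create(
    engine="davinci",
    prompt=f"Original: {utterance}\nStandard English:",
    temperature=0,      # keep the model conservative: correct, don't rewrite
    max_tokens=60,
    stop=["\n"],
)

corrected = response.choices[0].text.strip()
print(corrected)  # the corrected utterance can be resubmitted to the NLU engine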

2. Text Summarization

Text summarization is a handy auxiliary tool for a few reasons. The first is shortening user input. The other is shortening replies, especially where a knowledge base is queried and the response needs to be trimmed to a specific length depending on the medium.

A paragraph is summarized into one sentence

Another use of text summarization is to present the user with an auto-summarized dialog, with a "read more" option that expands into the longer, un-summarized version.
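
A minimal sketch of the reply-shortening case, under the same assumptions (`openai` package, davinci engine); the knowledge-base reply and the "Tl;dr:" cue are illustrative.

```python
import openai

openai.api_key = "YOUR_API_KEY"

# Hypothetical long-form answer returned from a knowledge base
kb_reply = (
    "Our premium plan includes 24/7 telephone support, a dedicated account "
    "manager, quarterly on-site reviews and priority access to new features..."
)

response = openai.Completion.create(
    engine="davinci",
    prompt=kb_reply + "\n\nTl;dr:",  # trailing cue asks the model to summarize
    temperature=0.3,
    max_tokens=60,
)

print(response.choices[0].text.strip())  # a one- or two-sentence version of the reply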

3. Keywords

Keywords can be extracted from a block of text. You can configure the model to be conservative, selecting only keywords that appear in the text, or set a higher temperature so that related words and keywords are generated.

Keywords generated from a Wikipedia paragraph.

This is very helpful for categorizing text and creating a search index. In the image above, an extract on soccer was taken from Wikipedia. GPT-3 condensed this fairly large paragraph into six keywords or themes.
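
A minimal sketch of keyword extraction under the same assumptions; the paragraph and the temperature value are illustrative. A low temperature keeps the output close to words that appear in the text; a higher value yields related terms as well.

```python
import openai

openai.api_key = "YOUR_API_KEY"

paragraph = (
    "Association football, more commonly known as soccer, is a team sport "
    "played between two teams of eleven players with a spherical ball..."
)

response = openai.Completion.create(
    engine="davinci",
    prompt=f"Text: {paragraph}\n\nKeywords:",
    temperature=0.3,   # raise towards 0.7+ to generate related, not just extracted, keywords
    max_tokens=60,
    stop=["\n"],
)

print(response.choices[0].text.strip())  # e.g. a comma-separated list of themes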

4. Parse Unstructured Data

Create tables from long-form text by specifying a structure and supplying a few examples.

Here you can see the first entry is directly derived from the sentence. The subsequent entries are inferred, yet still relevant and applicable.
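
A minimal few-shot sketch of this pattern, under the same assumptions; the free-text sentence and the table structure are illustrative.

```python
import openai

openai.api_key = "YOUR_API_KEY"

# A few-shot prompt: the table header and one completed row show the structure,
# the free-text sentence supplies the data to be parsed.
prompt = (
    "Extract the fruits mentioned in the text into a table of fruit, colour and taste.\n\n"
    "Text: On the island we found small purple neoskizzles that taste like candy floss, "
    "and sour, bright green loheckles.\n\n"
    "| Fruit | Colour | Taste |\n"
    "| neoskizzles | purple | candy floss |\n"
)

response = openai.Completion.create(
    engine="davinci",
    prompt=prompt,
    temperature=0,
    max_tokens=100,
)

print(response.choices[0].text)  # the model continues the table row by row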

5. Classification

Classify items into categories via example inputs. Companies are named and their categories defined; a new company can then be mentioned and automatically classified.

With limited training data, a new company can be mentioned and auto-classified.
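
A minimal sketch of example-driven classification, under the same assumptions; the company names and categories are illustrative.

```python
import openai

openai.api_key = "YOUR_API_KEY"

# A handful of labeled examples define the categories; the final line is classified.
prompt = (
    "The following is a list of companies and the categories they fall into:\n\n"
    "Facebook: Social media\n"
    "Uber: Transport\n"
    "McDonald's: Fast food\n"
    "Spotify:"
)

response = openai.Completion.create(
    engine="davinci",
    prompt=prompt,
    temperature=0,
    max_tokens=10,
    stop=["\n"],
)

print(response.choices[0].text.strip())  # expected: something like "Music streaming"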

6. Extract Contact Information

Extract contact information from a block of text. In this case, an address.

A complete address is extracted from the free-text message.
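
A minimal sketch of this extraction, under the same assumptions; the message text is illustrative.

```python
import openai

openai.api_key = "YOUR_API_KEY"

message = (
    "Hi, thanks for the quick reply. Please ship the replacement router to "
    "42 Long Street, Gardens, Cape Town, 8001, and let me know the tracking number."
)

response = openai.Completion.create(
    engine="davinci",
    prompt=f"Extract the postal address from this message:\n\n{message}\n\nAddress:",
    temperature=0,
    max_tokens=60,
    stop=["\n\n"],
)

print(response.choices[0].text.strip())  # the address on its own, ready for a form or CRM field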

7. Summarize For A Second Grader

This functionality takes a complex and relatively long piece of text and summarizes and simplifies it into a sentence or two.

A large and complex piece of text is summarized and simplified.
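
A minimal sketch of this simplification, under the same assumptions; the source passage is illustrative.

```python
import openai

openai.api_key = "YOUR_API_KEY"

passage = (
    "Jupiter is the fifth planet from the Sun and the largest in the Solar System. "
    "It is a gas giant with a mass one-thousandth that of the Sun, but two-and-a-half "
    "times that of all the other planets in the Solar System combined."
)

response = openai.Completion.create(
    engine="davinci",
    prompt=f"Summarize this for a second-grade student:\n\n{passage}",
    temperature=0.5,
    max_tokens=60,
)

print(response.choices[0].text.strip())  # e.g. "Jupiter is a really big planet far from the Sun."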

Conclusion

There are definitely good implementation opportunities for the Conversational AI aspect of GPT-3.

A restaurant review is created from a few keywords and the restaurant name.

As a support API, where text can be pre-processed to assist existing NLU functionality, there is a very real use case.

As mentioned, GPT-3 can be a great help in pre-processing user input on behalf of the NLU engine. The challenge is that GPT-3 also seems very well positioned to write reviews, compile questions and hold a general conversation. This could lead to a proliferation of bots writing reviews, online ads and general marketing copy.

An apple pie review based on four generic words.

This automation need not be malicious in principle. OpenAI is seemingly making every effort to ensure responsible use of the APIs.

The fact that extensive training is not required, and that a few keywords or phrases can point the API in the right direction, is astounding.

There are, however, open-source alternatives for most of this functionality.

Positives

  • GPT-3 has quite a bit of functionality which can serve to augment a current chatbot.
  • Dialog can be diversified with the NLG capability.
  • General chit-chat can easily be created.
  • Copywriting is made easy for slogans, headlines, reviews etc.
  • Text transformation
  • Text generation
  • Creating a general-purpose bot to chat to.
  • With the underlying processing power and data, creating flexible machine learning dialog stories should be a good fit.

Not-so Positives

  • The API is cloud-hosted only.
  • Cost.
  • The potential for large-scale social media content generation by bots.
  • Not a framework for sustainable chatbot scaling, yet.
  • Possible over- and under-steering with the training data.

--


Written by Cobus Greyling

I’m passionate about exploring the intersection of AI & language. www.cobusgreyling.com
