
How To Build A Chatbot with GPT-3

And What Tools Are At Your Disposal…

Cobus Greyling

--

Introduction

There is something immensely impressive about creating a conversational agent, aka chatbot, with only a few lines of code via Python or cURL. And the only training data is a sentence or two on the persona and tone of the chatbot, nothing else.

While chatting with OpenAI’s chatbot, there is no fallback proliferation, you can digress, and the conversation does not get stuck on ambiguity. Also, for the novice there is a sense of achievement in the thought that they created a chatbot. The Natural Language Generation (NLG) is on par with human conversation; there is none of that unfamiliar feel usually associated with NLG.

Training Data: “The following is a conversation with an AI assistant. The assistant is helpful, creative, clever, and very friendly.” Intent-less, with natural language generation.

Since OpenAI made text processing available with GPT-3, the question has been asked: is this the ultimate and only interface you will ever need to create a Conversational AI interface, or chatbot?

Why still bother with other frameworks and environments?

Yes, there are instances where GPT-3 can be used in standalone mode. For instance: a mood-to-color chatbot, a fun and sarcastic chatbot, a friend-like chit-chat bot, etc.

However, currently, GPT-3 has minimal capacity for supporting fine-tuning projects.

OpenAI is working on building a self-serve fine-tuning endpoint that will make this feature accessible to all users, but concrete timelines are not available.

For now, in most production and enterprise implementations, GPT-3 will play a support role… but let’s take a look at the extent to which the OpenAI API can be trained and adapted with custom information.

Basic Conversational GPT-3 Chatbot

This interface accepts an input prompt from the user, and the model returns one or more predicted completions or responses. Added to this, a possible follow-up question is also presented. This follow-up question can be shown to the user as, you might also want to know about…

Human: What is a compiler?
AI: A compiler is a computer program that translates source code written in a programming language into another computer language.
Human: What is the purpose of a compiler?

Given a prompt, the model will return one or more predicted completions, and can also return the probabilities of alternative tokens at each position.
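
To illustrate the token probabilities, here is a minimal sketch using the logprobs parameter of the Completion endpoint, assuming the pre-1.0 openai Python package; the engine name and prompt are illustrative assumptions for the sketch:

import os
import openai

openai.api_key = os.getenv("OPENAI_API_KEY")

# Request the completion plus the top 5 alternative tokens at each position.
response = openai.Completion.create(
    engine="davinci",  # assumed engine for this sketch
    prompt="Human: What is a compiler?\nAI:",
    max_tokens=40,
    logprobs=5,
    stop=["\n"]
)

print(response["choices"][0]["text"])
print(response["choices"][0]["logprobs"]["top_logprobs"][0])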

This is an extract from the OpenAI Playground, where a single line of training data is given. Each dialog turn is denoted with Human and AI.

The image above is an extract from the Playground with the bot description shown, and the conversation I had with the bot. The persona of the bot can also be described in detail, for instance:

“Eric is a chatbot that reluctantly answers questions.”

An example of the simplest Python configuration running in a notebook.

For a more practical and technical example, the AI assistant is shown here in a notebook written in Python. You will have to run the command:

pip install openai

From here you can start your Python code with:

import os
import openai

As you can see, there are 17 lines of code making up the chatbot.
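
To make this concrete, here is a minimal sketch of such a notebook chatbot, assuming the pre-1.0 openai Python package and the Completion endpoint; the engine name, sampling parameters and loop structure are illustrative assumptions rather than the exact notebook shown above:

import os
import openai

# Assumes the API key is set as an environment variable.
openai.api_key = os.getenv("OPENAI_API_KEY")

# The only "training data": a sentence or two describing the persona and tone.
start_prompt = ("The following is a conversation with an AI assistant. "
                "The assistant is helpful, creative, clever, and very friendly.\n")

history = start_prompt

while True:
    user_input = input("Human: ")
    history += f"Human: {user_input}\nAI:"

    response = openai.Completion.create(
        engine="davinci",          # assumed engine for this sketch
        prompt=history,
        temperature=0.9,
        max_tokens=150,
        presence_penalty=0.6,
        stop=["\n", " Human:", " AI:"]
    )

    answer = response["choices"][0]["text"].strip()
    print(f"AI: {answer}")
    history += f" {answer}\n"

The single start_prompt string is the only “training data”; every dialog turn is simply appended to the prompt so the model retains the context of the conversation.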

Positives

There are a few really special capabilities on display here which can easily be missed.

There are a few elements where GPT-3 is forward looking. These include the use of NLG, an end-to-end training implementation (also referred to as intent deprecation or an intentless implementation), and entities that are managed seamlessly.

  • The Natural Language Generation (NLG) is stellar; often NLG has an unfamiliar feel to it. Not with GPT-3. No wonder the API excels at copywriting, summarization and other text processing tasks.
  • There is no fallback proliferation; the chatbot interface stays on its feet and responds with diverse answers to move the dialog forward.
  • There are no impediments to digression.
  • Seemingly the context is maintained quite well.
  • Ambiguous input is also handled well.

Not So Positives

GPT-3 is not a chatbot development framework. Fine-tuning does not currently exist for elements like:

  • Policies
  • Slot Filling / Forms
  • Dialog Management
  • Larger sets of training data
  • Intent & entity extraction

Due to the avant-garde nature of GPT-3, it will be interesting to see how OpenAI will introduce a measure of dialog management, fine-tuning of the chatbot with training data, etc.

As mentioned in a previous article, GPT-3 has a few APIs which are well suited to augmenting and supporting an existing chatbot implementation. The GPT-3 conversational API is ideal for general conversation, Q&A, friendship bots, etc.

But it fails to meet the requirements of a domain-specific, enterprise chatbot.

Customizing The GPT-3 Chatbot

There is some measure of customization possible via the API, but there are limitations. Here is some of the functionality available for using your own data.

These options might serve as some insight as to what lies ahead for GPT-3.

Create Classification

Within the Python code, you can see the query:

query="It seems like I first need to go slower and make sure :(",

With the query, labeled examples are provided:

examples=[
["You can drive across the intersection.", "Green"],
["Go slower to check the intersection.", "Orange"],
["You must stop at the intersection.", "Red"]
],

Given the query and the set of labeled examples, the model will predict the most likely label for the query. This can serve to classify text, or label sentences.

Classification examples are given. In true GPT-3 fashion, only a few lines of “training” data are needed. The user input is labeled, or classified, according to the data.

What is interesting is that the example data is included in the query. I initially thought the same approach would be taken as with IBM Watson, where you submit the training data and later reference it from the query with an ID.
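
Pulling the fragments above together, a full call to the (since-deprecated) Classifications endpoint could look roughly like this; the model and search_model values are assumptions for the sketch:

import os
import openai

openai.api_key = os.getenv("OPENAI_API_KEY")

# The labeled examples travel with the query itself; no separate training step.
result = openai.Classification.create(
    search_model="ada",    # assumed; used to rank the examples
    model="curie",         # assumed; used to predict the label
    examples=[
        ["You can drive across the intersection.", "Green"],
        ["Go slower to check the intersection.", "Orange"],
        ["You must stop at the intersection.", "Red"]
    ],
    query="It seems like I first need to go slower and make sure :(",
    labels=["Green", "Orange", "Red"],
)

print(result["label"])   # expected to come back as "Orange"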

Create Answers To Questions

This endpoint answers the specified question using the provided documents and examples.

The API first searches uploaded documents or files to find relevant context. The relevant context is combined with the provided examples and the question to create the prompt for completion.

The question with the example documents. In this case the documents are used to answer the question.

A second query based on the same input.

Another question is posed and the answer is retrieved from the examples supplied.
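
As a rough sketch of how such a question-answering call is made, the (since-deprecated) Answers endpoint accepts the question, the documents and a few worked examples in a single request; the document contents, model names and parameters below are illustrative assumptions:

import os
import openai

openai.api_key = os.getenv("OPENAI_API_KEY")

# Small in-line documents; larger collections can be uploaded as a file instead.
result = openai.Answer.create(
    search_model="ada",    # assumed; used to find the relevant documents
    model="curie",         # assumed; used to compose the final answer
    question="Which puppy is happy?",
    documents=["Puppy A is happy.", "Puppy B is sad."],
    examples_context="In 2017, U.S. life expectancy was 78.6 years.",
    examples=[["What is human life expectancy in the United States?",
               "78 years."]],
    max_tokens=10,
    stop=["\n", "<|endoftext|>"],
)

print(result["answers"][0])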

Create Search

The search endpoint computes similarity scores between the provided query and a set of documents. Documents can be passed directly to the API.

Three continents are listed, along with a city. The continent to which the city belongs is returned.

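A sketch of that continent example against the (since-deprecated) Search endpoint might look like the following; the engine name and document list are assumptions for illustration:

import os
import openai

openai.api_key = os.getenv("OPENAI_API_KEY")

documents = ["Africa", "Asia", "Europe"]

# Each document gets a similarity score against the query.
result = openai.Engine("davinci").search(   # assumed engine for this sketch
    documents=documents,
    query="Paris"
)

# Pick the highest-scoring document, i.e. the continent the city belongs to.
best = max(result["data"], key=lambda d: d["score"])
print(documents[best["document"]])          # expected: "Europe"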

Conclusion

There are elements of GPT-3 which are mind-blowing, and the text related tasks are super useful: summarization, simplification, theming, keyword extraction, etc.

Talking to the chatbot API is surreal. The NLG and contextual awareness are astounding. It is only once you start thinking of building a domain-specific enterprise solution, with scaling and abstraction, that the challenges start.
