How To Create A GPT-3 Chatbot In 12 Lines Of Code

Using AI To Create Conversational AI

Cobus Greyling

--

Introduction

There is a certain allure to using AI to create AI, and more specifically a Conversational Agent.

The premise is that you describe your chatbot in natural language, in terms of its personality, domain, area of interest and tone. And in an instant, you have a chatbot which is resilient and knowledgeable.

This example of the GPT-3 playground shows how the chatbot is described in one sentence.

All the elements of intents, entities, state management and dialog are baked into one monolith of a chatbot.

Obviously fine-tuning comes to mind, in the form of a serious concern, which I tried to address here.

The advent of GPT-3 did spark a discussion of note on low-code implementations for chatbots.

Low-code interfaces are made available via a single tool or a collection of tools which are very graphic in nature and initially intuitive to use, thus delivering the guise of rapid onboarding and speeding up the process of delivering solutions to production.

As with many approaches of this nature, initially it seems like a very good idea. However, as functionality, complexity and scaling start playing a role, huge impediments are encountered.

The Code

Forgetting for a moment about all the impediments when it comes to a low-code implementation, and what is required for large-scale corporate implementations in terms of fine-tuning and the rest…

Let’s just appreciate how quickly a general conversational AI assistant can be created. Some of the elements which can be overlooked, but which are really astounding, are:

  • The absolute absence of fallback proliferation. The bot stays on its feet, so to speak, and does not default to a specific fallback when the intent cannot be discovered.
  • Conversational context is maintained, intents and entities are detected intuitively.
  • Digression and to a degree disambiguation are present.
  • The natural language generation (NLG) is remarkable.

Below is the simplest code example possible in GPT-3; in Python.

The prompt portion is where the bot is described and a few dialog turns are made available.

There are a few settings available to tune the behaviour:

  • temperature
  • max tokens
  • frequency and presence penalty
  • stop, which defines the dialog turn indicators
pip install openai

import openai

# Authenticate with your OpenAI API key
openai.api_key = "###################"

# The prompt describes the bot and seeds the conversation with a few dialog turns
response = openai.Completion.create(
    engine="davinci",
    prompt="The following is a conversation with an AI assistant. The assistant is helpful, creative, clever, and very friendly.\n\nHuman: Hello, who are you?\nAI: I am an AI created by OpenAI. How can I help you today?\nHuman: What is RAM?\nAI:",
    temperature=0.9,
    max_tokens=150,
    top_p=1,
    frequency_penalty=0.0,
    presence_penalty=0.6,
    stop=["\n", " Human:", " AI:"]
)

print(response)

The JSON response is clear and a few other metrics are included.

{
  "choices": [
    {
      "finish_reason": "stop",
      "index": 0,
      "logprobs": null,
      "text": " RAM is a computer storage space used to temporarily store data while the computer is processing it."
    }
  ],
  "created": 162374566348,
  "id": "cmpl-xxxxxxxxxxxxxxxxxxxxxxxxxxx",
  "model": "davinci:2020-05-03",
  "object": "text_completion"
}
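
If you only need the generated reply, it can be read straight from the first element of choices; a minimal sketch, assuming the response object returned by the call above:

# The reply text sits in the first (and only) choice
reply = response["choices"][0]["text"].strip()
print(reply)
# RAM is a computer storage space used to temporarily store data...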

And that is about as simple as it can be from a code perspective.
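
The same pattern extends to a running conversation by appending every exchange to the prompt, so the model keeps the context from turn to turn. Below is a minimal sketch; the chat helper is my own illustration and not part of the OpenAI library.

import openai

openai.api_key = "###################"

DESCRIPTION = ("The following is a conversation with an AI assistant. "
               "The assistant is helpful, creative, clever, and very friendly.")

def chat(history, user_input):
    # Build the prompt from the bot description plus the dialog so far
    prompt = DESCRIPTION + history + "\nHuman: " + user_input + "\nAI:"
    response = openai.Completion.create(
        engine="davinci",
        prompt=prompt,
        temperature=0.9,
        max_tokens=150,
        stop=["\n", " Human:", " AI:"]
    )
    reply = response["choices"][0]["text"].strip()
    # Return the reply and the updated history for the next turn
    return reply, history + "\nHuman: " + user_input + "\nAI: " + reply

history = ""
reply, history = chat(history, "Hello, who are you?")
reply, history = chat(history, "What is RAM?")
print(reply)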

Here are some of the settings you can tweak in your chatbot.

The temperature and top_p settings control how deterministic the API is in generating a response.

If you’re asking the API to provide you with a response where there’s only one right answer, then you’d want to set these lower.

If you’re looking for a response that’s not obvious, then you might want to set them higher.

The number one mistake people make with these settings is assuming that they’re “cleverness” or “creativity” controls.
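
To make the difference concrete, here is a sketch of the same API used with tighter and looser sampling; the prompts and values are illustrative assumptions, not recommendations.

# Assumes `import openai` and openai.api_key are set as in the example above.

# Factual question: keep sampling tight so the answer stays consistent
factual = openai.Completion.create(
    engine="davinci",
    prompt="Q: How many bits are in a byte?\nA:",
    temperature=0,     # always pick the most likely tokens
    top_p=1,
    max_tokens=20,
    stop=["\n"]
)

# Open-ended prompt: loosen sampling so the output varies between calls
creative = openai.Completion.create(
    engine="davinci",
    prompt="Write a one-line slogan for a coffee shop:",
    temperature=0.9,   # allow less likely, more varied tokens
    top_p=1,
    max_tokens=20,
    stop=["\n"]
)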

Conclusion

As mentioned before, there are definitely good implementation opportunities for the Conversational AI aspect of GPT-3.

Restaurant review based on a name and key words.

As a support API where text can be processed to assist existing NLU functionality, there is a very real use case.
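
As a rough illustration of that pre-processing idea, GPT-3 could be asked to rewrite a noisy utterance into a cleaner form before it reaches the NLU engine. The prompt wording and the clean_for_nlu helper below are my own assumptions, shown only as a sketch.

# Assumes `import openai` and openai.api_key are set as in the example above.

def clean_for_nlu(utterance):
    # Ask the model to rewrite the utterance as one short, well-formed sentence
    # before it is handed to a conventional NLU engine for intent detection.
    response = openai.Completion.create(
        engine="davinci",
        prompt=("Rewrite the following message as one short, grammatical sentence "
                "that keeps the original meaning:\n\n"
                "Message: " + utterance + "\nRewritten:"),
        temperature=0.2,
        max_tokens=60,
        stop=["\n"]
    )
    return response["choices"][0]["text"].strip()

# Example: clean_for_nlu("uhm my card got eaten by the atm thingy yesterday??")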

As mentioned, GPT-3 can be a great help in pre-processing user input for the NLU engine. The challenge is that GPT-3 seems very well positioned to write reviews, compile questions and hold a general conversation. This could lead to a proliferation of bots writing reviews, online ads and general copy.

An apple pie review based on four generic words.

This automation does not need to be malicious, in principle.

OpenAI is seemingly making every effort to ensure the responsible use of the APIs.

The fact that extensive training is not required, and that a few key words or phrases can point the API in the right direction, is astounding.
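
As a rough sketch of that key-word pattern, a review like the apple pie example above can be steered with nothing more than a handful of terms; the prompt wording here is my own assumption.

# Assumes `import openai` and openai.api_key are set as in the example above.

keywords = "apple pie, flaky crust, cinnamon, homemade"

review = openai.Completion.create(
    engine="davinci",
    prompt="Write a short, enthusiastic review based on these key words: "
           + keywords + "\n\nReview:",
    temperature=0.8,
    max_tokens=80
)

print(review["choices"][0]["text"].strip())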

There are, however, open-source alternatives for most of the functionality available.

Pros

  • GPT-3 has quite a bit of functionality which can serve to augment a current chatbot.
  • Dialog can be diversified with the NLG capability.
  • General chit-chat can easily be created.
  • Copywriting is made easy for slogans, headlines, reviews etc.
  • Text transformation
  • Text generation
  • Creating a general-purpose bot to chat to.
  • With their underlying processing power and data, creating flexible Machine Learning stories should be a good fit.

Cons

  • The API is cloud hosted
  • Cost
  • Social media bot content generation
  • Not a framework for sustainable chatbot scaling; yet.
  • Possible over- and under-steering with training data.
  • Fine-tuning
