Create A Q&A Chatbot Using GPT-3

Provide Search Documents Or Use GPT-3 Default Search

Cobus Greyling
6 min read · Jun 1, 2021


Introduction

Amongst others, there have been two general notions within the chatbot framework ecosystem.

The first is the deprecation of intents, which is front and center when examining GPT-3.

There are four emerging approaches to the deprecation of intents.

In a general Q&A chatbot that does not use any training data, you can see how GPT-3 fields questions. Notice how context is maintained throughout the conversation, and how users can ask questions referencing earlier context in the conversation.

The second is the deprecation of the state machine. This is necessary to introduce a more flexible conversational flow.

One of the avenues to introduce more flexibility into a state-machine-driven dialog management environment, where all conversational paths and responses are pre-defined, is a feature where, if no intent is detected with high confidence, the dialog defaults to searching a knowledge base and responds with the result, hopefully in a conversational manner.
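As a rough, framework-agnostic illustration of this pattern, the sketch below falls back to a naive keyword search over a small document list whenever no intent clears a confidence threshold. The classifier, threshold and documents are all illustrative stand-ins, not any specific framework's API.

# Minimal, framework-agnostic sketch of the fallback-to-search pattern.
# The classifier, threshold and knowledge base are illustrative stand-ins.

CONFIDENCE_THRESHOLD = 0.7

def classify_intent(utterance):
    """Stand-in intent classifier: returns (intent, confidence)."""
    if "opening hours" in utterance.lower():
        return "store_hours", 0.95
    return "none", 0.10

def tokenize(text):
    return set(text.lower().replace("?", "").replace(".", "").replace(",", "").split())

def search_knowledge_base(utterance, documents):
    """Naive keyword overlap search, a placeholder for a real search back-end."""
    query_words = tokenize(utterance)
    scored = [(len(query_words & tokenize(doc)), doc) for doc in documents]
    best_score, best_doc = max(scored, key=lambda pair: pair[0])
    return best_doc if best_score > 0 else "Sorry, I could not find an answer."

def handle_turn(utterance, documents):
    intent, confidence = classify_intent(utterance)
    if confidence >= CONFIDENCE_THRESHOLD:
        # High-confidence intent: follow the pre-defined conversational path.
        return f"(handled by the '{intent}' dialog path)"
    # No confident intent: default to the knowledge base instead of a fallback intent.
    return search_knowledge_base(utterance, documents)

docs = ["Our store is open from 09:00 to 17:00 on weekdays.",
        "Returns are accepted within 30 days with a receipt."]
print(handle_turn("What are your opening hours?", docs))
print(handle_turn("Can I return an item after 30 days?", docs))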

This is not something unique to a specific chatbot framework. NVIDIA Jarvis, which was released recently, has integration examples where Wikipedia serves as a knowledge base that can be searched. Other platforms like MindMeld, Rasa, Microsoft, IBM and more make provision for such functionality.

Obviously these systems vary in complexity and implementation effort.

In this story we are taking a look at the ease with which GPT-3 can perform this task.

General Search With GPT-3

GPT-3 has a very powerful and simple interface for general question and answer conversations. More than that, the context of the conversation is managed exceptionally well.

As per the example above, the question is asked: “Who won the F1 title in 2000?”

Subsequently the user can ask, “For which team did he drive?”.

And “Who was his team member?”.

The context of the conversation is maintained, making the Q&A interface much more conversational than other Q&A systems. In general, most Q&A solutions focus on a single dialog turn, retrieving a single response with the relevant information.

This is not the case with GPT-3.
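One common way this context is carried forward with a completion-style API is to append every question and answer to a running transcript, so a follow-up such as “For which team did he drive?” is resolved against the earlier turns. Below is a minimal sketch, assuming the 2021-era openai Python library and the davinci engine; the ask helper and parameter choices are illustrative, not the article's original code.

import os
import openai  # assumes the 2021-era openai library; the Completions API has since been superseded

openai.api_key = os.getenv("OPENAI_API_KEY")

# Seed the transcript with the first question and its answer.
transcript = "Q: Who won the F1 title in 2000?\nA: Michael Schumacher won the F1 title in 2000.\n"

def ask(question, transcript):
    """Append the new question to the running transcript and complete the next answer."""
    prompt = transcript + f"Q: {question}\nA:"
    response = openai.Completion.create(
        engine="davinci", prompt=prompt, temperature=0, max_tokens=60, stop=["\n"]
    )
    answer = response.choices[0].text.strip()
    return answer, transcript + f"Q: {question}\nA: {answer}\n"

# Follow-up questions can now reference "he" and "his" from earlier turns.
answer, transcript = ask("For which team did he drive?", transcript)
print(answer)
answer, transcript = ask("And who was his team member?", transcript)
print(answer)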

Below is a GPT-3 Search chatbot written in Python in 14 lines of code. Line 8, a string assigned to the variable called prompt, acts as the training data.


Below is the question posed to GPT-3.

Who won the F1 title in 2011?

This is the string of training data used for the chatbot, with “Q” denoting the questions and “A” the answers.

prompt="I am a highly intelligent question answering bot. If you ask me a question that is rooted in truth, I will give you the answer. If you ask me a question that is nonsense, trickery, or has no clear answer, I will respond with \"Unknown\".\n\nQ: What is human life expectancy in the United States?\nA: Human life expectancy in the United States is 78 years.\n\nQ: Who was president of the United States in 1955?\nA: Dwight D. Eisenhower was president of the United States in 1955.\n\nQ: Which party did he belong to?\nA: He belonged to the Republican Party.\n\nQ: What is the square root of banana?\nA: Unknown\n\nQ: How does a telescope work?\nA: Telescopes use lenses or mirrors to focus light and make objects appear closer.\n\nQ: Where were the 1992 Olympics held?\nA: The 1992 Olympics were held in Barcelona, Spain.\n\nQ: How many squigs are in a bonk?\nA: Unknown\n\nQ: Who won the F1 title in 2011?\nA:"

The Good

  • The Natural Language Generation (NLG) is stellar; often NLG has an unfamiliar feel to it. Not with GPT-3. No wonder the API excels in copywriting, summarization and other text processing tasks.
  • There is no fallback proliferation; the chatbot interface stays on its feet and responds with diverse answers to move the dialog forward.
  • There are no impediments to digression.
  • Seemingly the context is maintained quite well.
  • Ambiguous input is also handled well.
  • Training data or guidance is minimal.
  • Dialog management is performed automatically.
  • Handling of conversational context, and of context specific to undefined entities in the conversation, is remarkable.

The Not-So-Good

  • Corporates might have concerns regarding brand damage resulting from unconstrained NLG. Some form of dialog curation might be required.
  • Fine-tuning is not yet granular enough for enterprise implementations.
  • The API is cloud-hosted.
  • Cost.
  • Larger sets of training data.
  • Intent and entity extraction.

Custom Search Using Provided Documents

In the example below we follow a two-step process: first upload a data file, and then have search results returned based on this file.

Our search file or document only has two lines, each with a text entry and metadata.

{"text": "puppy A is happy", "metadata": "emotional state of puppy A"}
{"text": "puppy B is sad", "metadata": "emotional state of puppy B"}

Next, we upload this file using the Files API. After running pip install to install openai, this is about the simplest possible way to upload search documents, perhaps apart from cURL.

OpenAI is installed using pip. Python is used to mount a Google Drive and access the training file. The training process is initiated.
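Below is a rough sketch of that upload step, assuming the 2021-era openai Python library and its Files endpoint with the since-deprecated search purpose; the file name, API key handling and optional Colab Drive mount are illustrative.

# pip install openai

import openai

# If running in Google Colab, the Drive holding the JSONL file can be mounted first:
# from google.colab import drive
# drive.mount('/content/gdrive')

openai.api_key = "YOUR_API_KEY"  # or read it from an environment variable

# Upload the two-line JSONL document shown above for use with the (legacy) Search endpoint.
result = openai.File.create(
    file=open("puppies.jsonl"),  # illustrative file name
    purpose="search"             # the 'search' purpose as it existed at the time; since deprecated
)

print(result)           # the response includes the file ID
file_id = result["id"]  # keep the ID to reference the search data later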

If something is wrong with your Python code, a 400 or 401 error will be returned. A successful return message is shown below.

Most importantly, the file ID is given, which can be used to reference the search data.

The result of the file uploading and training process.

Below is the Python code to query the single word “happy” while referencing our file ID.
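A minimal sketch of that query, assuming the since-deprecated engine-level Search endpoint of the 2021-era openai library; the engine choice and the placeholder file ID are assumptions, with the real ID coming from the upload response above.

import openai

openai.api_key = "YOUR_API_KEY"

# Query the uploaded documents with the single word "happy",
# referencing the file ID returned by the upload step (placeholder value here).
result = openai.Engine("ada").search(
    file="file-XXXXXXXXXXXX",  # replace with the file ID from the upload response
    query="happy",
    max_rerank=5
)

print(result)  # documents ranked by relevance; "puppy A is happy" should score highest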


You will see the result, specific to the document uploaded.

Conclusion

This example illustrates how a conventional chatbot can be augmented by a body of searchable data. This can lessen the dependence on the fallback intent, which invariably results in fallback proliferation.

This is especially true in cases where a conversational interface does not have a narrow domain approach and needs to accommodate a wider, or even open, domain in terms of user questions.


Cobus Greyling

I explore and write about all things at the intersection of AI & language; LLMs/NLP/NLU, Chat/Voicebots, CCAI. www.cobusgreyling.com