What Are OpenAI Assistant Function Tools Exactly?

And how does a function tool look in its simplest form? In this article you will find some background information, and a complete Python notebook with a simple working example.

Cobus Greyling
5 min read · Nov 10, 2023


Introduction

Large Language Models (LLMs) take unstructured, conversational data as input, and the output from LLMs is likewise unstructured and conversational.

Function Tools are a way to structure and format the output from the LLM, usually into a format which is ready to submit to an external API.

The name can be misleading, and some assume that the LLM actually runs code or performs an integration to call a function. In reality, the LLM populates parameters in a predefined JSON document by matching entities and phrases from the user input to a pre-defined JSON schema.

The structured JSON output can be used for anything: merely to structure data, for data storage, and so on.

More On Functions

Function calling allows you to describe custom functions to the Assistant, for use in accessing external APIs. This makes it easier to call external functions, since the LLM generates output in the JSON format an API expects, with the JSON pre-populated with the relevant input arguments.

Function calling is an avenue to structure LLM output: users define schemas for “functions”, and the LLM selects a schema and populates its fields. The Function Tool therefore only attempts to prepare the data in the format the API expects.

Functions have been part of the Chat Completions API for a while, and Assistants now support the same functionality.
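
As a point of reference, the sketch below shows what the equivalent looks like against the Chat Completions API. It is a minimal, illustrative example and not part of the Assistant code in this article; the get_weather schema and the user question are assumptions purely for demonstration.

from openai import OpenAI

client = OpenAI(api_key="your openai api key goes here")

# Describe an illustrative function tool; the model decides whether to call it.
response = client.chat.completions.create(
    model="gpt-4-1106-preview",
    messages=[{"role": "user", "content": "What is the weather in Cape Town?"}],
    tools=[{
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Get the current weather for a city.",
            "parameters": {
                "type": "object",
                "properties": {
                    "city": {"type": "string", "description": "Name of the city"}
                },
                "required": ["city"]
            }
        }
    }]
)

# The populated JSON arguments are returned as a string on the tool call.
tool_calls = response.choices[0].message.tool_calls
if tool_calls:
    print(tool_calls[0].function.name)       # get_weather
    print(tool_calls[0].function.arguments)  # {"city": "Cape Town"}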

Below is the simplest possible application; you can copy this Python code, paste it into a notebook and just run it. All you need to add is your own OpenAI API key.

The user input to the Assistant is: Send Cobus from Kore AI an email asking for the monthly report?

And below is the JSON document generated, complete with line breaks in the body of the mail.

send_email({
  "to_address": "cobus@kore.ai",
  "subject": "Request for Monthly Report",
  "body": "Dear Cobus,\n\nI hope this message finds you well. I am reaching out to kindly request the monthly report. Could you please provide the latest update at your earliest convenience?\n\nThank you in advance for your assistance.\n\nBest regards,"
})
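
The arguments arrive as a JSON string, so on the application side you would typically parse them and hand them to your own code; the Assistant never sends the email itself. Below is a minimal, hypothetical sketch of that step; the local send_email function is a stand-in and not part of the OpenAI API.

import json

# Hypothetical local implementation; the LLM only populates the arguments,
# it does not run any code or perform the integration itself.
def send_email(to_address: str, subject: str, body: str) -> str:
    print(f"Sending email to {to_address} with subject: {subject}")
    return "sent"

# arguments is the JSON string produced by the model, as shown above.
arguments = '{"to_address": "cobus@kore.ai", "subject": "Request for Monthly Report", "body": "..."}'
result = send_email(**json.loads(arguments))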

In the code, under tools, a tool of type function is defined with three properties: to_address, subject and body.

pip install --upgrade openai
########################################
from openai import OpenAI  # only the client class is needed
########################################
api_key = "your openai api key goes here"
########################################
client = OpenAI(api_key=api_key)
########################################
# Create an Assistant with a single function tool, send_email,
# described by a JSON schema with three properties.
assistant = client.beta.assistants.create(
    instructions="You are an HR bot, answering HR questions.",
    model="gpt-4-1106-preview",
    tools=[{
        "type": "function",
        "function": {
            "name": "send_email",
            "description": "Please send an email.",
            "parameters": {
                "type": "object",
                "properties": {
                    "to_address": {
                        "type": "string",
                        "description": "To address for email"
                    },
                    "subject": {
                        "type": "string",
                        "description": "Subject of the email"
                    },
                    "body": {
                        "type": "string",
                        "description": "Body of the email"
                    }
                }
            }
        }
    }]
)
########################################
# Create a thread to hold the conversation.
thread = client.beta.threads.create()
########################################
# Add the user message to the thread.
message = client.beta.threads.messages.create(
    thread_id=thread.id,
    role="user",
    content="Send Cobus from Kore AI an email asking for the monthly report?"
)
########################################
# Run the Assistant against the thread.
run = client.beta.threads.runs.create(
    thread_id=thread.id,
    assistant_id=assistant.id,
    instructions="Use the function tool for this query."
)
########################################
# Retrieve the run to check its status.
run = client.beta.threads.runs.retrieve(
    thread_id=thread.id,
    run_id=run.id
)
########################################
# List the messages on the thread.
messages = client.beta.threads.messages.list(
    thread_id=thread.id
)
########################################
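
The code above retrieves the run only once, so depending on timing it may still show a queued or in_progress status. To actually see the populated function arguments, the run can be polled until it reaches the requires_action status. The sketch below is an optional extra step, assuming the client, thread and run objects from the code above; the tool output submitted back is a placeholder.

import time

# Poll until the run pauses for tool output (or finishes/fails).
while run.status in ("queued", "in_progress"):
    time.sleep(1)
    run = client.beta.threads.runs.retrieve(thread_id=thread.id, run_id=run.id)

if run.status == "requires_action":
    tool_calls = run.required_action.submit_tool_outputs.tool_calls
    for tool_call in tool_calls:
        print(tool_call.function.name)       # send_email
        print(tool_call.function.arguments)  # the JSON document shown earlier

    # The run stays paused until tool outputs are submitted back;
    # the output string here is a placeholder.
    run = client.beta.threads.runs.submit_tool_outputs(
        thread_id=thread.id,
        run_id=run.id,
        tool_outputs=[
            {"tool_call_id": tc.id, "output": "Email sent."} for tc in tool_calls
        ]
    )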

Once the code has executed, you can head to your OpenAI dashboard, where you will see that your Assistant has been created and that the send_email tool is listed under Functions.

The name and the instructions of the Assistant can also be set there.

When interacting with the Assistant in the dashboard, the JSON document is generated in the response, and the logs can also be examined there.
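
If you prefer to inspect the same information in code rather than in the dashboard, the steps of a run can also be listed via the API. A small sketch, assuming the thread and run objects from the earlier code:

# List the steps the run went through (tool calls, message creation, and so on).
run_steps = client.beta.threads.runs.steps.list(
    thread_id=thread.id,
    run_id=run.id
)

for step in run_steps.data:
    print(step.type, step.status)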

In Closing

Functions are a step in the right direction: the output format of data from the LLM can be defined, and a structure or framework is supplied within which the LLM can perform a type of data transformation.

⭐️ Follow me on LinkedIn for updates on Large Language Models ⭐️

I’m currently the Chief Evangelist @ Kore AI. I explore & write about all things at the intersection of AI & language; ranging from LLMs, Chatbots, Voicebots, Development Frameworks, Data-Centric latent spaces & more.

