Flowise Now Has Custom Tools With OpenAI Function Calling

In the latest Flowise version, Custom Tools are introduced together with OpenAI Function Calling. In this article I cover a practical implementation, step by step.

Cobus Greyling
5 min read · Jun 26, 2023


I’m currently the Chief Evangelist @ HumanFirst. I explore & write about all things at the intersection of AI and language; ranging from LLMs, Chatbots, Voicebots, Development Frameworks, Data-Centric latent spaces & more.

The latest release of Flowise accommodates OpenAI Function Calling, whereby the LLM returns structured data which is then passed to Custom Tools for execution.

The OpenAI LLM decides when to use a tool, and structures the user input into the data required for an API call.
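To illustrate what happens under the hood, here is a minimal sketch of a function calling round trip with the OpenAI Node.js SDK (v3), outside of Flowise; the get_weather definition is an illustrative assumption, not something Flowise generates verbatim:

// Minimal function calling sketch with the OpenAI Node.js SDK (v3).
// The get_weather definition below is an illustrative assumption.
const { Configuration, OpenAIApi } = require("openai");

const openai = new OpenAIApi(
  new Configuration({ apiKey: process.env.OPENAI_API_KEY })
);

async function main() {
  const completion = await openai.createChatCompletion({
    model: "gpt-3.5-turbo-0613",
    messages: [
      {
        role: "user",
        content: "What is the weather at latitude 52.52 and longitude 13.41?",
      },
    ],
    functions: [
      {
        name: "get_weather",
        description: "Gets the current weather for a location",
        parameters: {
          type: "object",
          properties: {
            latitude: { type: "number", description: "Latitude of the location" },
            longitude: { type: "number", description: "Longitude of the location" },
          },
          required: ["latitude", "longitude"],
        },
      },
    ],
  });

  // When the model decides the tool is needed, it returns a function_call
  // with the arguments extracted from the user input as a JSON string,
  // e.g. '{"latitude": 52.52, "longitude": 13.41}'.
  console.log(completion.data.choices[0].message.function_call);
}

main();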

Here is a practical step-by-step working example:

For this example we want to call the Open-Meteo weather API and have the OpenAI function calling feature process the user input and format it into the structure the API requires.

Below is the weather API we will be calling by making use of OpenAI function calling:

https://api.open-meteo.com/v1/forecast?latitude=52.52&longitude=13.41&current_weather=true
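As a quick sanity check, the endpoint can be called directly, independently of Flowise; below is a minimal Node.js sketch, assuming node-fetch is installed:

// Call the Open-Meteo endpoint directly to inspect the raw payload
// the tool will later return to the Agent.
const fetch = require("node-fetch");

async function main() {
  const url =
    "https://api.open-meteo.com/v1/forecast?latitude=52.52&longitude=13.41&current_weather=true";
  const response = await fetch(url);
  const data = await response.json();
  console.log(data.current_weather); // e.g. { temperature: 22.7, windspeed: 4.7, ... }
}

main();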

Considering the image below, on the left-hand side of the FlowiseAI IDE, there is a new menu option called Tools.

From here, a new tool can be created by clicking on the blue Create New button.

As seen below, there is an existing tool called Get_Weather.

Considering the image below, when a new tool is created, the tool can be named [1], and a description [2] must be given for the tool.

The tool description will help the Agent to understand which tool to select for each user request.

The parameters can then be defined via the IDE [3]. OpenAI Function Calling will take the user input and assign it to the different parameters listed in the output schema.

In the example [3], two parameters are defined: Latitude and Longitude.

Each parameter is given the type number, and the description is used by OpenAI to determine which entity from the user input should be associated with which variable.

Hence Flowise facilitates the translation of the user input into a JSON document that conforms to this output schema.
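For reference, here is a sketch of the kind of function definition this output schema plausibly translates to on the OpenAI side; the exact name and wording Flowise generates are my assumption:

// My reconstruction of the function definition derived from the
// output schema; property names mirror the fields defined in the IDE.
const getWeatherFunction = {
  name: "get_weather",
  description: "Gets the current weather for a given latitude and longitude",
  parameters: {
    type: "object",
    properties: {
      Latitude: { type: "number", description: "The latitude of the location" },
      Longitude: { type: "number", description: "The longitude of the location" },
    },
    required: ["Latitude", "Longitude"],
  },
};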

When scrolling down in the tool window, the Edit Tool section is visible. Here JavaScript code can be entered; this is the code that is executed when the tool is used. The properties specified in the output schema can be used directly as variables here.

In our example, Latitude and Longitude are defined in the output schema, so these values are accessible in the JavaScript simply by referencing the variables $Latitude and $Longitude, as in the sketch below.
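Here is a sketch of what the tool code could look like; it assumes node-fetch is available to the tool sandbox, and that the code runs inside an async function so that await and return are valid at the top level:

// Sketch of a possible Get_Weather tool body. $Latitude and $Longitude
// are injected by Flowise from the output schema.
const fetch = require("node-fetch");

const url = `https://api.open-meteo.com/v1/forecast?latitude=${$Latitude}&longitude=${$Longitude}&current_weather=true`;

try {
  const response = await fetch(url);
  // Return the raw JSON as a string; the Agent turns it into
  // conversational output.
  return await response.text();
} catch (error) {
  console.error(error);
  return "";
}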

From the Marketplaces tab, the OpenAI Agent can be selected. This is an agent that uses OpenAI’s Function Calling functionality to pick the tool to invoke and the variables to pass to it.

For simplicity I deleted the Serper component from the flow.

As seen below, the Custom Tool component has a dropdown from which the newly defined Get_Weather tool can be selected. The dropdown is populated with the names of all the tools defined within the IDE.

Below is a conversation with the agent; there are a few interesting points to note.

Firstly, the ease of integration between the conversational/LLM layer and the integration/API layer is astounding. The process of structuring unstructured conversational data and performing entity extraction is completely automated.

Secondly, even though the API response is cryptic, it is transformed from structured data into unstructured conversational language.

Thirdly, follow-up questions can be asked based on the response of the API.

The user question is: What is the weather currently at latitude 52.52 and longitude 13.41?

The API response is:

{"latitude":52.52,
"longitude":13.419998,
"generationtime_ms":0.225067138671875,
"utc_offset_seconds":0,
"timezone":"GMT",
"timezone_abbreviation":"GMT",
"elevation":38.0,
"current_weather":{"temperature":22.7,
"windspeed":4.7,
"winddirection":86.0,
"weathercode":1,"is_day":0,
"time":"2023-06-25T20:00"}}

And yet the chatbot conversational response is:

The current weather at latitude 52.52 and longitude 13.41 is 22.7°C with a windspeed of 4.7 km/h coming from the east (86°). It is currently nighttime.

A follow-up question is asked: What is the elevation at latitude 52.52 and longitude 13.41?

With the chatbot response: The elevation at latitude 52.52 and longitude 13.41 is 38.0 meters.

This information is embedded within the API response, and follow-up questions are answered by the Agent implicitly from the API response.

A follow-up question is then asked which is related to the API response, but whose answer is not embedded in the API response: What big city is located there?

The response of the Agent is astoundingly accurate: The latitude 52.52 and longitude 13.41 corresponds to the city of Berlin, Germany.

However, here it is clear that the line between API-related and LLM-related responses is blurred.

Finally, the way in which tools can be segregated and designed within Flowise makes for easy creation and management of tools. Tools can subsequently be used and reused within multiple chat flows.

⭐️ Follow me on LinkedIn for updates on Conversational AI ⭐️
