Flowise For 🦜🔗 LangChain

Flowise is an open-source graphical user interface for building LLM-based applications on LangChain.

Cobus Greyling
4 min read · May 17, 2023


Introduction

Flowise is an open-source project which will always be free for commercial and personal use.

It is based on 🦜️🔗 LangChain.js and is a very advanced graphical user interface for developing LLM-based applications.

These applications are also known as Gen Apps, LLM Apps, Prompt Chaining, LLM Chains, etc.

LLM Apps

Various building blocks are emerging in the LLM application building ecosystem.

These include Prompt Engineering, Agents, Chaining, Semantic Search, Chat Models, Vector Stores and various tools which can be assigned to an Agent to execute.

New methods of building flexible conversational interfaces are emerging.

With LLM-based chat flows, the conversation design and build process is less granular, and much of the detail and minutiae of exception handling, chit-chat (small talk), repair paths and more can be offloaded to the LLM’s resilience.

Even though Flowise is free to use, consideration needs to be given to hosting costs. Add to this all the third-party API calls, and costs will be incurred.

These costs can escalate quite quickly based on user volumes and the extent to which these systems are leveraged.

Another consideration will be latency when accessing geographically disparate systems, and more.

Back To Flowise

Flowise is such an intuitive LLM App development framework. Even though I am in the early stages of prototyping with Flowise, I do get the sense that Flowise is a much more rounded and complete development UI than LangFlow.

Below you see the dashboard with API keys, Marketplaces and Chatflows. Notice how the development components constituting the chat flow are listed.

There is a marketplace available with quite a number of pre-configured applications, which serves to fast-track users to the point of having a working prototype.

The application below is a basic weather bot, and probably the simplest LLM Chain which can be written, with only three components: an OpenAI connector, a Prompt Template and the LLM Chain connecting the two.
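In LangChain.js terms (the library Flowise is built on), that three-component flow corresponds to roughly the following sketch. The prompt wording and the {city} variable are my own illustrative choices, not the exact values used in the Flowise template.

```typescript
import { OpenAI } from "langchain/llms/openai";
import { PromptTemplate } from "langchain/prompts";
import { LLMChain } from "langchain/chains";

// Component 1: the OpenAI connector (assumes OPENAI_API_KEY is set in the environment)
const llm = new OpenAI({ temperature: 0 });

// Component 2: the Prompt Template -- template text and {city} variable are illustrative
const prompt = new PromptTemplate({
  template: "What is the weather typically like in {city} at this time of year?",
  inputVariables: ["city"],
});

// Component 3: the LLM Chain connecting the two
const chain = new LLMChain({ llm, prompt });

const response = await chain.call({ city: "Cape Town" });
console.log(response.text);
```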

The application below shows a basic translation bot, and it also illustrates how prompt values can be formatted.
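Under the hood this comes down to prompt value formatting in LangChain.js: values supplied at runtime are substituted into the template's placeholders before the prompt is sent to the model. A minimal sketch, with an assumed translation template and variable names:

```typescript
import { PromptTemplate } from "langchain/prompts";

// Illustrative translation prompt; {language} and {sentence} are placeholders
// that get filled in when the prompt is formatted.
const prompt = new PromptTemplate({
  template: "Translate the following sentence into {language}:\n\n{sentence}",
  inputVariables: ["language", "sentence"],
});

// format() substitutes the supplied values into the template
const formatted = await prompt.format({
  language: "German",
  sentence: "How are you today?",
});
console.log(formatted);
// Translate the following sentence into German:
//
// How are you today?
```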

Below is a general knowledge conversational chatbot which leverages OpenAI and the Serp API, and holds conversational memory.
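For context, the same three ingredients can be sketched in LangChain.js as a conversational agent: a chat model, the Serp API as a search tool, and a buffer memory that holds the chat history. The agent type and option names below are my assumptions about how such a flow is wired up, not Flowise's exact configuration; the OpenAI and SerpAPI keys are read from the environment.

```typescript
import { ChatOpenAI } from "langchain/chat_models/openai";
import { SerpAPI } from "langchain/tools";
import { BufferMemory } from "langchain/memory";
import { initializeAgentExecutorWithOptions } from "langchain/agents";

// Chat model (OPENAI_API_KEY) and the Serp API search tool (SERPAPI_API_KEY)
const model = new ChatOpenAI({ temperature: 0 });
const tools = [new SerpAPI()];

// Conversational agent that keeps the chat history in a buffer memory
const executor = await initializeAgentExecutorWithOptions(tools, model, {
  agentType: "chat-conversational-react-description",
  memory: new BufferMemory({ returnMessages: true, memoryKey: "chat_history" }),
});

const first = await executor.call({
  input: "When was the last Formula One grand prix in South Africa?",
});
console.log(first.output);

// The follow-up question relies on the stored chat history for context
const second = await executor.call({ input: "Who won that race?" });
console.log(second.output);
```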

Below is the conversation I had with the chatbot; notice how conversational context is maintained throughout.


AI : Hi there! How can I help?
Me : hi
AI : Hello! How can I assist you today?
Me : When was the last Formula One grand prix in South Africa?
AI : The last Formula One Grand Prix in South Africa was held in 1993 at the Kyalami Circuit.
Me : Who won that race?
AI : The winner of the 1993 South African Grand Prix was Alain Prost.
Me : Who was his biggest rival?
AI : Alain Prost's biggest rival during his Formula One career was Ayrton Senna.
Me : Is he still alive?
AI : Yes, Alain Prost is still alive.
Me : And Ayrton Senna?
AI : Ayrton Senna passed away on May 1, 1994 following a crash during the San Marino Grand Prix.

What I find exciting is how the development environment is unfolding and how different design and development affordances are taking shape.

Makers will have to explore how to translate their requirements into the different development components available.

I’m currently the Chief Evangelist @ HumanFirst. I explore and write about all things at the intersection of AI and language, including NLU design, evaluation & optimisation, data-centric prompt tuning, and LLM observability, supervision and fine-tuning.

https://www.linkedin.com/in/cobusgreyling
