Flux Is An Open-Source Tool For LLM Prompt Completion, Exploration & Mapping
Flux is described as a power tool for interfacing with expansive LLM prompts.
Flux can be considered a natural progression from what is currently the norm for LLM playgrounds. Instead of a single, linear approach to prompts, it takes a parallel, dialog-tree approach.
With a few additions, Flux can become a very useful no-code prompt creation and chaining tool…below I explain how.
There are two ways of accessing Flux: you can use a hosted instance, or you can install it locally:
```shell
git clone https://github.com/paradigmxyz/flux.git
cd flux
npm install
npm run dev
```
Flux is open source and, according to Flux, the tree structure allows you to:
- Generate a wider variety of creative responses
- Test out different prompts with the same shared context
- Use inconsistencies to identify where the model is uncertain
Let me start with the good, followed by what improvements can be made.
The Good
Flux is a web-based graphic interface for creating prompt flows. It works well as a more expansive LLM playground: where a playground is very linear, Flux’s canvas allows for a non-linear, parallel approach, as seen below:
Conversation context is managed via a few-shot approach: it is evident that previous dialog turns/prompts are submitted together with each new user input.
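As an illustration, here is a minimal TypeScript sketch of how a selected node’s ancestry could be flattened into the messages array for the OpenAI chat completions endpoint. The types and function names (FluxNode, collectContext, complete) are my own assumptions for illustration, not Flux’s actual code.

```typescript
// A minimal sketch (not Flux's actual code) of how a node's ancestry
// could be flattened into the messages array for the OpenAI chat
// completions endpoint.
type Role = "system" | "user" | "assistant";

interface ChatMessage {
  role: Role;
  content: string;
}

interface FluxNode {
  role: Role;
  text: string;
  parent?: FluxNode;
}

// Walk from the selected node up to the root (the blue system prompt node)
// and return the turns in chronological order.
function collectContext(node: FluxNode): ChatMessage[] {
  const messages: ChatMessage[] = [];
  for (let current: FluxNode | undefined = node; current; current = current.parent) {
    messages.unshift({ role: current.role, content: current.text });
  }
  return messages;
}

// Each new completion re-submits the full path as few-shot context.
async function complete(node: FluxNode, apiKey: string): Promise<string> {
  const response = await fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${apiKey}`,
    },
    body: JSON.stringify({
      model: "gpt-3.5-turbo",
      messages: collectContext(node),
    }),
  });
  const data = await response.json();
  return data.choices[0].message.content;
}
```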
Flux works well for experimentation with prompts, archiving and comparing different outputs.
Areas Of Improvement
LLM Interface
Flux has a connector to OpenAI only, referencing the models gpt-3.5-turbo, gpt-4 and gpt-4-32k, as seen below. More parameters and connectors to more LLMs would make sense. Even just having access to a wider range of OpenAI models would help.
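A simple model registry along these lines could make the connector easier to extend. The interface and field names below are purely illustrative assumptions, not part of Flux today.

```typescript
// Hypothetical connector configuration; not part of Flux's codebase.
interface ModelConfig {
  provider: "openai" | "anthropic" | "cohere"; // assumed provider list
  model: string;
  temperature?: number;
  maxTokens?: number;
  topP?: number;
}

// The three models Flux currently references, with room for more.
const availableModels: ModelConfig[] = [
  { provider: "openai", model: "gpt-3.5-turbo" },
  { provider: "openai", model: "gpt-4" },
  { provider: "openai", model: "gpt-4-32k" },
];
```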
Tabs
Multiple trees can be created on a single canvas. Frameworks like Flowise and LangFlow make use of tabs to segregate different flows or applications. It would make sense for the Flux UI to be able to create and name tabs for different flows.
Templating
A natural progression would be prompt templates, so users can create placeholders/variables within a prompt template and set their values, as sketched below. This would lead towards a prompt-chaining approach.
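A minimal templating sketch, assuming simple {variable} placeholders; this is illustrative only and not a current Flux feature.

```typescript
// Replace {name} placeholders in a template with supplied values;
// unknown placeholders are left untouched.
function renderTemplate(
  template: string,
  variables: Record<string, string>
): string {
  return template.replace(/\{(\w+)\}/g, (match, name: string) =>
    name in variables ? variables[name] : match
  );
}

// Usage: fill the placeholders before sending the prompt to the model.
const template =
  "Summarise the following {document_type} in {word_count} words:\n{text}";

const prompt = renderTemplate(template, {
  document_type: "support ticket",
  word_count: "50",
  text: "Customer reports that exports fail with a timeout error...",
});
```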
Prompt Chaining
Flux does not perform prompt chaining, where prompts are chained to each other and the output of one step feeds the next. There are no shared variables; Flux is a graphic representation of a few-shot conversational prompt, hence a prompt which has the dialog history embedded in it.
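For contrast, here is a short sketch of what prompt chaining looks like: the output of one prompt is substituted into the next. The chain function and the callLLM parameter are placeholders of my own, standing in for any completion call.

```typescript
// Prompt chaining for contrast: step 1's output becomes a variable in
// step 2's prompt. Flux instead re-submits the dialog history as context.
async function chain(
  topic: string,
  callLLM: (prompt: string) => Promise<string>
): Promise<string> {
  // Step 1: generate an outline.
  const outline = await callLLM(`Write a three-point outline about ${topic}.`);
  // Step 2: pass the outline into the next prompt as a shared variable.
  const article = await callLLM(
    `Expand the following outline into a short article:\n${outline}`
  );
  return article;
}
```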
Path Indicator
As seen below, when you click on a node, the right-hand pane is populated with the prompt path, or dialog turns if you like. You can scroll this pane all the way up to the top; the prompt path is given up to the blue system prompt node.
It would be a great addition if the node path were highlighted within the flow/tree all the way up to the very first blue system node.
As seen below, navigation is available from the main menu, but as I mentioned, a level of automation would be stellar.
ChatML Export
The prompts are segmented according to system, user and model roles. If a leg of the tree could be selected and exported in ChatML notation, implementing the prompt elsewhere would be simplified significantly; the prompts would then be in an interchangeable format for further development.
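Such an export could look roughly like the sketch below, which serialises a selected branch into ChatML. Flux does not offer this yet, so the toChatML function and the export shape are assumptions for illustration.

```typescript
// A hedged sketch of exporting a selected branch to ChatML notation.
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

// Serialise each turn as <|im_start|>role ... <|im_end|>.
function toChatML(messages: ChatMessage[]): string {
  return messages
    .map((m) => `<|im_start|>${m.role}\n${m.content}<|im_end|>`)
    .join("\n");
}

// Example branch: system prompt, one user turn, one model response.
const branch: ChatMessage[] = [
  { role: "system", content: "You are a helpful assistant." },
  { role: "user", content: "Summarise the release notes." },
  { role: "assistant", content: "Here is a short summary..." },
];

console.log(toChatML(branch));
```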
I’m currently the Chief Evangelist @ HumanFirst. I explore and write about all things at the intersection of AI and language; ranging from LLMs, Chatbots, Voicebots, Development Frameworks, Data-Centric latent spaces and more.