An Overview Of The Stack AI Composer For Gen Apps

The Stack AI Composer is a production-ready, hosted IDE to build Large Language Model-based Generative Apps.

Cobus Greyling
5 min read · May 31, 2023

  • The Stack AI Composer is reminiscent of the approach taken by LangFlow, Flowise and ChainForge.
  • Stack is not open-sourced and is a paid service running in the cloud. I guess it's safe to assume that local data centre installation will be available for enterprise customers.
  • Stack does have a multi-modal component, which can facilitate Foundation Model development.
  • Attention to detail within the IDE is immense, as you will read later.
  • An open-sourced IDE like Flowise or LangFlow will most probably expand more rapidly in terms of development components, due to community contributions.
  • Flow design in Stack is simplified, and it is quicker to achieve results comparable to those of the open-sourced IDEs. Functionality like autonomous chain-of-thought reasoning, conversation context management, etc. is all managed and executed under the hood.
  • Developing generative apps can be abstract; this has been a common comment on LangFlow and Flowise. Consider below how the input types and output points are indicated. Hence my comment on the attention to detail within Stack.
  • Again, there is a sense of refinement in the UI, as seen below. Input nodes are tagged with a description for each input point, along with descriptions for the output.
  • Within its flow-creation IDE, Stack is focused on the following functionality:

Conversation Context Memory

Stack has an option to allow for conversation state management without the overhead of managing conversation history in a prompt.

All the LLM memory is encrypted end-to-end in the Stack AI database. This data can be self-hosted under the Stack AI enterprise plan. As the image below shows, even in a very simple implementation, contextual memory is stored and accessed quite well.
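To put into perspective the overhead Stack removes, here is a minimal sketch of what manually carrying conversation history in the prompt typically looks like. This is my own illustration, not Stack's API; it assumes the OpenAI Python library (2023-era ChatCompletion interface) and a placeholder API key.

import openai

openai.api_key = "YOUR_API_KEY"  # placeholder; supply your own key

# Without a managed memory component, the running conversation history
# has to be stored by the developer and re-sent on every turn.
history = [
    {"role": "system", "content": "You are a helpful assistant."}
]

def chat(user_message):
    history.append({"role": "user", "content": user_message})
    response = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",
        messages=history,  # the full history travels with every request
    )
    reply = response["choices"][0]["message"]["content"]
    history.append({"role": "assistant", "content": reply})
    return reply

chat("My name is Cobus.")
print(chat("What is my name?"))  # answered correctly only because history is re-sent

Storing, truncating and persisting this history is exactly the bookkeeping Stack performs under the hood, with the stored memory encrypted in its database.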

Agents

A glaring omission from the list of development components is that of Agents.

Agents are a concept generally known from LangChain and Haystack.

The idea is to answer complex questions via chain-of-thought reasoning, decomposing a difficult question into simpler sub-questions.

However, when posing a complex question like the one below, Stack recognises the complexity and decomposes the question for answering in an autonomous fashion.

What is the square root of the year of birth of the man generally 
considered as the father of the iPhone?

Notice below how the question is decomposed and answered in a chain-of-thought reasoning fashion. Each step is explained by the Stack interface.

Hence my statement earlier that Stack handles complexity under the hood and surfaces a simplified experience to the user. Does Stack have an Agent node component? No. Does Stack incorporate Agent functionality in its components in an autonomous fashion? Yes!

Sure, let's answer your questions step by step:

1. The person generally considered as the father of the iPhone is
Steve Jobs.
2. Steve Jobs was born on February 24th, 1955.
3. To calculate the square root of his year of birth,
we simply take the square root of 1955, which is approximately 44.24.

Therefore, the square root of the year of birth of the
man generally considered as the father of the iPhone
(Steve Jobs) is approximately 44.24.
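For comparison, achieving this kind of autonomous decomposition in an open-sourced framework typically means wiring up an Agent explicitly. The sketch below is my own illustration using LangChain, not Stack's implementation; it assumes an OpenAI API key is configured and uses LangChain's built-in llm-math calculator tool. (Computed exactly, the square root of 1955 is roughly 44.22.)

from langchain.llms import OpenAI
from langchain.agents import load_tools, initialize_agent, AgentType

# A zero-shot ReAct agent: the LLM reasons step by step and calls tools as needed.
llm = OpenAI(temperature=0)
tools = load_tools(["llm-math"], llm=llm)  # calculator tool for the arithmetic step

agent = initialize_agent(
    tools,
    llm,
    agent=AgentType.ZERO_SHOT_REACT_DESCRIPTION,
    verbose=True,  # prints the intermediate reasoning steps
)

agent.run(
    "What is the square root of the year of birth of the man "
    "generally considered as the father of the iPhone?"
)

In Stack this scaffolding is absent by design: there is no Agent node to drag onto the canvas, yet the decomposition still happens.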

Attention To Detail

Within the Stack IDE, immense attention has been given to detail. For instance:

Play: Within a completed flow, you can enter text in the input node and click on the play button. This executes the flow and the LLM output is displayed.

History: Versioning of the application is performed automatically with date/time stamps for easy rollback.

Evaluate: The LLM prompt evaluation tool is an interesting approach to abstracting the LLM prompt input and output. This is quite reminiscent of how message abstraction is handled within chatbots.

Naturally, the flow can be exposed via an API in different formats. This allows for the creation of smart APIs; hence, one could consider Stack an API creator for advanced automation.
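Because Stack's endpoint schema is not covered here, the snippet below is purely illustrative: the URL, token and field names are hypothetical placeholders, and only the general pattern of calling a deployed flow over HTTP is the point.

import requests

# Hypothetical endpoint and payload schema; substitute the URL, token and
# field names that your deployed Stack flow actually exposes.
FLOW_URL = "https://api.example.com/v1/flows/<flow-id>/run"
headers = {"Authorization": "Bearer <your-api-token>"}
payload = {"input": "What is the square root of 1955?"}

response = requests.post(FLOW_URL, headers=headers, json=payload, timeout=30)
response.raise_for_status()
print(response.json())  # the flow's LLM output, returned as JSON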

Stack APIs can be useful in any Generative App, Conversational UI, etc.

⭐️ Please follow me on LinkedIn for updates on LLMs ⭐️

I’m currently the Chief Evangelist @ HumanFirst. I explore and write about all things at the intersection of AI and language; ranging from LLMs, Chatbots, Voicebots, Development Frameworks, Data-Centric latent spaces and more.

https://www.linkedin.com/in/cobusgreyling
