
Azure Prompt Flow For LLMs

The focus of Azure Machine Learning Prompt Flow is the streamlining of development, evaluation, continuous integration & deployment of prompt engineering projects.

Cobus Greyling
5 min read · May 31, 2023


I’m currently the Chief Evangelist @ HumanFirst. I explore & write about all things at the intersection of AI and language; ranging from LLMs, Chatbots, Voicebots, Development Frameworks, Data-Centric latent spaces & more.

Azure Prompt Flow is currently only available in private beta; access is granted via an application and vetting process.

To some extent, Azure Prompt Flow feels underwhelming compared with other GUIs for LLM development.

The objective of Prompt Flow is to assist with complex logic and flow control through the creation of effective prompts.

Azure does state that Prompt Flow provides integration options for existing LangChain developments. This is an ambitious statement, and it is unclear how seamlessly Prompt Flow will integrate with LangChain.

In the words of Azure:

This compatibility enables you to lift and shift your existing assets to prompt flow, facilitating Prompt Engineering, evaluation, and collaboration efforts to prepare your flow for production.

This smooth transition ensures that your previous work is not lost and can be further enhanced within the prompt flow environment for evaluation, optimisation and production.
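
For context, the kind of LangChain asset that would presumably be lifted and shifted is a prompt template with a chain wrapped around an LLM. Below is a minimal sketch of such a chain, assuming the current (early-2023) LangChain API and an OpenAI key in the environment; the prompt and model settings are illustrative only.

```python
# A minimal LangChain chain of the kind one might "lift and shift" into Prompt Flow.
# Assumes OPENAI_API_KEY is set in the environment; prompt and settings are illustrative.
from langchain.chains import LLMChain
from langchain.llms import OpenAI
from langchain.prompts import PromptTemplate

prompt = PromptTemplate(
    input_variables=["question"],
    template="Answer the question concisely.\n\nQuestion: {question}\nAnswer:",
)

chain = LLMChain(llm=OpenAI(temperature=0), prompt=prompt)
print(chain.run("What does prompt chaining mean?"))
```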

The objective of Prompt Flow is to move users past the initial stages of ideation and experimentation, hence completing the proverbial last mile to production-ready, LLM-supported applications.

Prompt Flow wants to achieve this objective in two ways:

  1. Agile prompt engineering, by introducing elements of visualisation, evaluation, flows, etc.
  2. Acting as a collaboration tool, facilitating deployment, monitoring, etc.

Recently I have experimented with IDEs like ChainForge, LangFlow and Flowise. And I have been immensely impressed with the capability of LLM-based autonomous agents.

Considering the image below:

  • I get the impression that the flow graphic on the top right is generated from notebook-like entries and is not interactive. It would be first prize if users could drag, drop and link prompt nodes via that design view.
  • The top-right graphic does show prompt chaining, where prompts can be chained together to form a larger conversational interface (a conceptual sketch follows this list).
  • There is an “inputs” node which acts as a decision point, but again…it would be stellar if the graphic could be expanded and used as a design & build canvas.
  • Bulk testing of prompts [2] is possible, and a chat [1] window can be launched as seen on the bottom right-hand corner [5].
  • Seemingly, multiple LLMs [3], prompts, Python code nodes and more can be added.
  • Prompt templating is hosted at the bottom of the page [6], with a template language. LLM settings can be set per prompt and run.
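
To make the chaining idea concrete, the sketch below strings two prompts together with an ordinary Python function in between. This is a conceptual illustration in plain Python, not Prompt Flow’s node syntax (which is not public at the time of writing); call_llm is a hypothetical stand-in for whichever LLM connection a flow would use.

```python
# Conceptual prompt chain: prompt node -> Python node -> prompt node.
# call_llm() is a hypothetical stand-in for the LLM connection a flow would use.
def call_llm(prompt: str) -> str:
    # Replace with a real completion call (e.g. Azure OpenAI) in practice.
    return f"<LLM response to: {prompt[:40]}...>"

def extract_products(question: str) -> str:
    """A 'Python node': ordinary code sitting between two prompt nodes."""
    return call_llm(f"List the product names mentioned in:\n{question}")

def answer(question: str) -> str:
    products = extract_products(question)  # first prompt in the chain
    follow_up = (
        "Answer the question, focusing on these products: "
        f"{products}\n\nQuestion: {question}"
    )
    return call_llm(follow_up)  # second prompt, built from the first prompt's output

print(answer("How does the chatbot compare to the voicebot?"))
```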

This window shows how prompt templates are created in a Zero-Shot and Few-Shot fashion, with LLM configuration settings like max_tokens, temperature, stop sequences, etc.
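
The template language looks Jinja-like in the screenshots. As an illustration of the concept (not Prompt Flow’s exact syntax), the snippet below renders a small Few-Shot template with Jinja2 and pairs it with the kind of per-prompt settings mentioned above; the field names and values are assumptions.

```python
# A Few-Shot prompt template rendered with Jinja2, plus per-prompt LLM settings.
# Illustrative only; field names and settings values are assumptions.
from jinja2 import Template

template = Template(
    "Classify the sentiment of the text as Positive or Negative.\n\n"
    "{% for ex in examples %}"
    "Text: {{ ex.text }}\nSentiment: {{ ex.label }}\n\n"
    "{% endfor %}"
    "Text: {{ text }}\nSentiment:"
)

prompt = template.render(
    examples=[
        {"text": "I love this product", "label": "Positive"},
        {"text": "Terrible support experience", "label": "Negative"},
    ],
    text="The onboarding was smooth and quick",
)

llm_settings = {"max_tokens": 5, "temperature": 0.0, "stop": ["\n"]}  # set per prompt and run
print(prompt)
```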


Bulk testing of prompts and prompt chains is available, along with evaluation input mapping and evaluation flows.
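
Conceptually, a bulk test runs the same prompt or chain over a batch of inputs and maps each output to an evaluation metric. The sketch below illustrates that loop with a simple exact-match check; run_flow is a hypothetical placeholder, not the Prompt Flow bulk-test API.

```python
# Conceptual bulk test: run a flow over many inputs and score each output.
# run_flow() is a hypothetical stand-in for executing one prompt/flow variant.
def run_flow(question: str) -> str:
    return "Positive"  # placeholder output

test_cases = [
    {"question": "I love this product", "expected": "Positive"},
    {"question": "Terrible support experience", "expected": "Negative"},
]

results = []
for case in test_cases:
    output = run_flow(case["question"])                 # bulk run
    results.append(output.strip() == case["expected"])  # evaluation input mapping: output -> metric

print(f"Accuracy: {sum(results) / len(results):.0%}")   # aggregated evaluation result
```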



Finally

Prompt Flow is focussed on creating and refining prompts, and on chaining prompts with some level of embedded logic.

The effort to create an IDE environment for prompt management makes sense, but I expected a more advanced IDE with an expanded design canvas for building more elaborate flows.

In general, there are three levels at which complexity can be added to LLM-based Generative Apps. Complexity can be accommodated in:

  1. Prompt Engineering
  2. Prompt Chaining
  3. Autonomous Agents

Considering frameworks like Haystack and LangChain, there is a bold move towards Autonomous Agents with an array of tools at their disposal. In this setting, prompts are very generic in nature and are compiled on the fly based on user requests.
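
By way of contrast, the sketch below shows roughly what such an agent looks like in LangChain today: a generic ReAct-style prompt is assembled on the fly from the tool descriptions and the user request. It assumes an OpenAI key in the environment, and the word-counting tool is purely illustrative.

```python
# A minimal LangChain agent sketch: generic prompt + tools, compiled per request.
# Assumes OPENAI_API_KEY is set; the word-counting tool is illustrative only.
from langchain.agents import AgentType, Tool, initialize_agent
from langchain.llms import OpenAI

def word_count(text: str) -> str:
    return str(len(text.split()))

tools = [
    Tool(
        name="WordCounter",
        func=word_count,
        description="Counts the number of words in a piece of text.",
    )
]

agent = initialize_agent(
    tools,
    OpenAI(temperature=0),
    agent=AgentType.ZERO_SHOT_REACT_DESCRIPTION,  # generic ReAct prompt, no task-specific prompt engineering
    verbose=True,
)

agent.run("How many words are in the sentence 'prompts are compiled on the fly'?")
```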

Hence, Prompt Flow feels to a large extent underwhelming. Having said this, I do not yet have access to Prompt Flow, so I have no first-hand experience of building prototypes with it.

⭐️ Please follow me on LinkedIn for updates on LLMs ⭐️


https://www.linkedin.com/in/cobusgreyling
