Haystack Launches PromptHub
PromptHub illustrates how prompts can be used in a range of LLM implementations.
I’m currently the Chief Evangelist @ HumanFirst. I explore & write about all things at the intersection of AI & language; ranging from LLMs, Chatbots, Voicebots, Development Frameworks, Data-Centric latent spaces & more.
Haystack’s PromptHub is a useful resource for prompt engineering in general; each of the 18 prompts listed can be explored via a GUI.
As seen below, each prompt is displayed together with a description of how it can be implemented and how it can be used in the Haystack environment.
The categories of generative apps covered in PromptHub include question answering, agents, conversational interfaces, language detection, sentiment analysis, translation, classification and summarisation.
Here is one of the question-answering prompt templates from PromptHub, named deepset/question-answering-per-document. Given a set of documents and a single query, this prompt is designed to answer the query once per document.
Given the context please answer the question. Context: {documents};
Question: {query};
Answer:
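To make the per-document behaviour concrete, here is a small, pure-Python sketch (not Haystack code; the helper function below is an illustrative assumption) of how the template is filled once per document rather than once over all documents combined:

```python
# Illustrative, pure-Python sketch (not Haystack code) of the
# per-document behaviour of this prompt template.
TEMPLATE = ("Given the context please answer the question. "
            "Context: {documents}; Question: {query}; Answer:")

def prompts_per_document(documents, query):
    # One rendered prompt for each document in the list,
    # rather than a single prompt over the concatenated documents.
    return [TEMPLATE.format(documents=doc, query=query) for doc in documents]

docs = ["Berlin is the capital of Germany.", "Paris is the capital of France."]
prompts = prompts_per_document(docs, "What is the capital of Germany?")
for p in prompts:
    print(p)
```

Each rendered prompt would then be sent to the model separately, yielding one answer per document.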
And here is how to use it in Haystack:
import os
from haystack.nodes import AnswerParser, PromptNode, PromptTemplate
question_answering_per_doc = PromptTemplate("deepset/question-answering-per-document", output_parser=AnswerParser())
prompt_node = PromptNode(model_name_or_path="text-davinci-003", api_key=os.environ.get("OPENAI_API_KEY"))
prompt_node.prompt(prompt_template=question_answering_per_doc, query="YOUR_QUERY", documents="YOUR_DOCUMENTS")
PromptHub has instructions on how to use the prompts with PromptTemplate and PromptNode.
Prompts found in PromptHub are all maintained on GitHub in deepset-ai/prompthub. Each prompt comes with a YAML file containing the prompt itself, and a prompt card with the same name. The prompt card is a markdown file detailing the intended use-case of the prompt and how to use it with a Haystack PromptNode.
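For a sense of what such a repository entry might contain, here is a hypothetical sketch of a prompt's YAML file; the field names below are illustrative assumptions, not confirmed from the actual deepset-ai/prompthub repository:

```yaml
# Hypothetical sketch of a PromptHub YAML entry; field names are
# illustrative assumptions, not taken from the actual repository.
name: deepset/question-answering-per-document
description: Answer a query once per document.
text: |
  Given the context please answer the question. Context: {documents};
  Question: {query};
  Answer:
tags:
  - question-answering
```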
Copy Prompt Identifier
The easiest way to use prompts from the PromptHub is to simply copy over their identifier and add it to the prompt field in a PromptTemplate.
For instance, to make use of the prompt identifier deepset/zero-shot-react, the line zero_shot_agent_template = PromptTemplate("deepset/zero-shot-react") can be used in the following way:
import os
from haystack.agents import Agent
from haystack.nodes import PromptNode, PromptTemplate
zero_shot_agent_template = PromptTemplate("deepset/zero-shot-react")
prompt_node = PromptNode(model_name_or_path="text-davinci-003", api_key=os.environ.get("OPENAI_API_KEY"), stop_words=["Observation:"])
agent = Agent(prompt_node=prompt_node, prompt_template=zero_shot_agent_template)
agent.run("Your query")
Use Local Prompts
PromptNode can use a PromptTemplate that contains the prompt for the model.
from haystack.nodes import PromptNode, PromptTemplate
from haystack import Document
# Initialize the node
prompt_node = PromptNode()
# Specify the template using the `prompt` method
# and pass your documents and questions:
prompt_node.prompt(prompt_template="deepset/question-answering",
documents=[Document("Berlin is the capital of Germany."), Document("Paris is the capital of France.")],
query="What is the capital of Germany?")
# Here's the output:
[<Answer {'answer': 'Berlin', 'type': 'generative', 'score': None, 'context': None, 'offsets_in_document': None, 'offsets_in_context': None, 'document_ids': ['1a7644ef76698b7a1c6ed23c357fa598', 'f225a94f83349e8776d6fb89ebfb41b8'], 'meta': {'prompt': 'Given the context please answer the question. Context: Berlin is the capital of Germany. Paris is the capital of France.; Question: What is the capital of Germany?; Answer:'}}>]
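To show how such a result might be consumed, here is a small, self-contained sketch; the dataclass below is a hypothetical, simplified stand-in for Haystack's Answer object, not the real class:

```python
from dataclasses import dataclass, field

# Hypothetical, simplified stand-in for Haystack's Answer object,
# just to illustrate reading the output shown above.
@dataclass
class Answer:
    answer: str
    type: str = "generative"
    meta: dict = field(default_factory=dict)

# Mimic the shape of the result above (one Answer per query).
result = [Answer(answer="Berlin",
                 meta={"prompt": "Given the context please answer the question. ..."})]

# Pull out just the generated answer strings:
answers = [a.answer for a in result]
print(answers)  # ['Berlin']
```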
PromptHub is a good resource for prototyping and experimenting with different LLM implementations such as ReAct, agents, tools and more.