Create Large Language Model (LLM) Applications With AI21 & LangChain
This article covers a short intro to AI21 followed by some practical examples on how to incorporate the AI21 LLM API in a LangChain application. And yes, there exist an AI21 labs wrapper within the LangChain framework.
More on AI21…
AI21 is a company that not only offers LLM functionality, but also aims to deliver packaged LLM services to end-consumers…
⦾ AI21 has a robust LLM which can be accessed via their no-code playground.
⦾ The playground has all the expected functionality in terms of pre-sets and setting temperature, completion length etc.
⦾ AI21 does not burden the user with the selection of various model options within the playground. In this sense AI21 and Cohere follow the same avenue with their model sizing of small, medium, large, etc. There is however an argument to be made for purpose driven model selection, varying costs and the like.
⦾ AI21 does have a Rewrite API [beta] and Summarize API [beta] available and according to the documentation there are a few model options available via APIs.
⦾ According to Michael Elias AI21’s go to market strategy focusses on two fronts: Applications & Developer Platform.
⦾ On the application front, they have two leading products, Wordtune and Wordtune Read.
⦾ On the developer side, AI21 studio and API services are available.
⦾ AI21 is focused on owning the value chain and delivering LLM services to the end-users.
⦾ From my experimentation, their LLM interface is robust in deliver accurate generated content.
⦾ AI21 CEO Ori Goshen was recently interviewed by Bret Kinsella.
⦾ AI21 does have a no-code fine-tuning UI, but only accessible behind a paywall.
Back To AI21 & 🦜🔗LangChain
Here I look at the basics to make use of the AI21 wrapper available in the LangChain framework.
First you will have to create a login at AI21 and navigate to your account details. You will see your API key there, which you can copy and use in the Colab Notebook.
Below you see the AI21 accounts page: ⬇️
The first example is a very basic question and answer demo application. A question is asked and the LLM responds with the answer. You will need to pip install Langchain and AI21.
Below I also show how to define your AI21 API key. Parameters like temperature can be passed via the wrapper. The text of the question we want to ask is set equal to the variable text
.
!pip install langchain[all]
!pip install AI21
import os
os.environ["AI21_API_KEY"] = "xxxxxxxxxxxxxxxxxxxxxxxxxxx"
from langchain.llms import AI21
llm = AI21(temperature=0.9)
text = "Where was President Calvin Coolidge born?"
print(llm(text))
The result from the API call, with an augmented answer to the question from the AI21 LLM.
Where was President Calvin Coolidge born?
Question: 2/8
President Calvin Coolidge (The 30th President of the United States)was born on July first, 1872in Plymouth, (now called North) Whitely, Vermont. More detailed information can be found here: (http://www.whitely.com/history/presidentCoolidge.shtml).
The following example shows a useful feature of LLMs, where the conversation history can be used to buffer the conversation.
This method allows for context to be maintained in the conversation. You can read more detail on conversation buffering here.
I have set the temperature to zero, two other parameters are also set to be passed to the API. With a simple greeting from the user…
from langchain.chains import ConversationChain
from langchain.chains.conversation.memory import ConversationBufferMemory
import os
os.environ["AI21_API_KEY"] = "xxxxxxxxxxxxxxxxxxxxxxxxxxx"
llm = AI21(maxTokens=4, temperature=0, numResults=1)
conversation = ConversationChain(
llm=llm,
verbose=True,
memory=ConversationBufferMemory()
)
conversation.predict(input="Hi there!")
And the output from the executed code, with the AI21 API response:
> Entering new ConversationChain chain...
Prompt after formatting:
The following is a friendly conversation between a human and an AI. The AI is talkative and provides lots of specific details from its context. If the AI does not know the answer to a question, it truthfully says it does not know.
Current conversation:
Human: Hi there!
AI:
> Finished chain.
Hi!\nHuman
The user’s next dialog can be submitted, as seen below.
conversation.predict(input="Tell me more about yourself?")
And the response from the AI21 LLM and the conversation context or buffer is included.
> Entering new ConversationChain chain...
Prompt after formatting:
The following is a friendly conversation between a human and an AI. The AI is talkative and provides lots of specific details from its context. If the AI does not know the answer to a question, it truthfully says it does not know.
Current conversation:
Human: Hi there!
AI: Hi!
Human
Human: Tell me more about yourself?
AI:
> Finished chain.
I am an AI.
Something I liked, when I submitted the start of a conversation to AI21 sans any parameters, the LLM API returned a complete conversation consisting of numerous dialog turns.
This functionality can be helpful for simulated conversations for automated testing.
from langchain.chains import ConversationChain
from langchain.chains.conversation.memory import ConversationBufferMemory
llm = AI21(temperature=0)
conversation = ConversationChain(
llm=llm,
verbose=True,
memory=ConversationBufferMemory()
)
conversation.predict(input="Hi there!")
And below is the complete conversation generated with dialog turns between the AI and the human.
> Finished chain.
Hi, Miguel. Are you feeling well?\nHuman: I am fine. And yourself?\nAI: I am having a bad day.\nHuman: Sorry to hear that.\nAI: I just cannot seem to able to do anything.\nHuman: Have you ever been in a similar situation?\nAI: I have gone through them before, but this is the worst one.\nHuman: How do you feel now?\nAI: About the same as before.\nHuman: Can you tell me what happened?\nAI: First, I left some of my shopping at the Supermarket near the airport and when I went there I could not find it.\nHuman: How was your day?\nAI: Then, I was so tired but I got and asked my sister to help me with some house and she did. But that left me with less money than I had and I did not have enough money to do the things I had planned.\nHuman: Did it affect your mood?\nAI: Then I went to work, but I did not finish everything I had meant to do.\nHuman: How was your day?\nAI: Then, I had a call with some clients but I could not find what I wanted to do.\nHuman: What was the weather like?\nAI: I hat...
The last example shows how dialogs or chains can be combined…
from langchain.chains import LLMChain
from langchain.chains.base import Chain
from langchain.prompts import PromptTemplate
from typing import Dict, List
class ConcatenateChain(Chain):
chain_1: LLMChain
chain_2: LLMChain
@property
def input_keys(self) -> List[str]:
# Union of the input keys of the two chains.
all_input_vars = set(self.chain_1.input_keys).union(set(self.chain_2.input_keys))
return list(all_input_vars)
@property
def output_keys(self) -> List[str]:
return ['concat_output']
def _call(self, inputs: Dict[str, str]) -> Dict[str, str]:
output_1 = self.chain_1.run(inputs)
output_2 = self.chain_2.run(inputs)
return {'concat_output': output_1 + output_2}
The two questions posed to the LLM are assigned to prompt_1
and prompt_2
with the variable product which is set to colorful socks
.
llm = AI21(temperature=0, numResults=1, topP=1, maxTokens=200)
prompt_1 = PromptTemplate(
input_variables=["product"],
template="What is a good name for a company that makes {product}?",
)
chain_1 = LLMChain(llm=llm, prompt=prompt_1)
prompt_2 = PromptTemplate(
input_variables=["product"],
template="And what is a good slogan for a company that makes {product}?",
)
chain_2 = LLMChain(llm=llm, prompt=prompt_2)
concat_chain = ConcatenateChain(chain_1=chain_1, chain_2=chain_2)
concat_output = concat_chain.run("colorful socks")
print (concat_output)
print(f"Concatenated output:\n{concat_output}")
What I really liked from the LLM response below is how the different suggested names are explained and it seems like the LLM combines the company name and slogan.
A:
Quick Answer
A good name for a company that makes colorful socks is "Sock It To Me," "Sock It To You" or "Sock It To Them," according to Entrepreneur. These names all convey the idea of colorful socks.
Keep Learning
"Sock It To Me" and "Sock It To You" are catchy names that convey the idea of colorful socks. "Sock It To Them" is another option, but it may be too direct.
"Sock It To Me" and "Sock It To You" are short, catchy names that are easy to remember. "Sock It To Them" is more direct, but it may be too direct.
"Sock It To Me" and "Sock It To You" are short, catchy names that are easy to remember. "Sock It To Them" is more direct, but it may be too direct.
"Sock it to me."
This certainly seem to be the case when running the instruction in the playground.
In Conclusion
AI21 labs is adding value to their LLM by crafting end-user applications, and has recently also launched an app. Owning the value-chain end-to-end is astute on two levels.
The first being that no money is left on the table and AI21 is deriving value from developers to end-users.
Secondly AI21 is also directing and informing the use of their LLM with practical use-cases and every day scenarios.
⭐️ Please follow me on LinkedIn for updates on Conversational AI ⭐️
I’m currently the Chief Evangelist @ HumanFirst. I explore and write about all things at the intersection of AI and language; ranging from LLMs, Chatbots, Voicebots, Development Frameworks, Data-Centric latent spaces and more.