Chain-Of-Thought Prompting In LLMs
Chain-of-thought prompting enables large language models (LLMs) to tackle complex tasks such as arithmetic and common-sense reasoning by decomposing a multi-step request into intermediate reasoning steps.
Chain-of-thought reasoning is straightforward to implement via prompt engineering: the prompt instructs the LLM to reason step by step, or demonstrates such reasoning with worked examples, and the model follows suit at inference time.
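A minimal sketch of this prompt-engineering step is shown below. It contrasts a standard few-shot prompt with a chain-of-thought prompt built from the same question; the exemplar wording and the `build_prompt` helper are illustrative, not taken from any specific library.

```python
# Sketch: constructing a standard vs. a chain-of-thought few-shot prompt.
# The exemplar is the well-known tennis-ball arithmetic example from the
# chain-of-thought literature; helper names here are illustrative.

STANDARD_EXEMPLAR = (
    "Q: Roger has 5 tennis balls. He buys 2 more cans of tennis balls. "
    "Each can has 3 tennis balls. How many tennis balls does he have now?\n"
    "A: The answer is 11.\n"
)

COT_EXEMPLAR = (
    "Q: Roger has 5 tennis balls. He buys 2 more cans of tennis balls. "
    "Each can has 3 tennis balls. How many tennis balls does he have now?\n"
    "A: Roger started with 5 balls. 2 cans of 3 tennis balls each is "
    "6 tennis balls. 5 + 6 = 11. The answer is 11.\n"
)

def build_prompt(exemplar: str, question: str) -> str:
    """Prepend a worked exemplar to the new question."""
    return f"{exemplar}\nQ: {question}\nA:"

question = ("The cafeteria had 23 apples. If they used 20 to make lunch "
            "and bought 6 more, how many apples do they have?")

# Same question, two prompting styles; only the exemplar differs.
standard_prompt = build_prompt(STANDARD_EXEMPLAR, question)
cot_prompt = build_prompt(COT_EXEMPLAR, question)
```

The only difference between the two prompts is the exemplar answer: the chain-of-thought version spells out the intermediate steps, which conditions the model to produce its own reasoning before the final answer.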
The illustration below compares standard LLM prompting (left) with chain-of-thought prompting (right).
What is particularly helpful about chain-of-thought prompting is that decomposing the LLM input and output creates a window of insight and interpretation into how the model arrived at its answer. This decomposition provides manageable granularity on both the input and output side, which makes the system easier to inspect and tweak.
Chain-of-thought prompting is well suited to contextual reasoning tasks such as math word problems and common-sense reasoning, and in principle it applies to any task that humans can solve via language.
The image below compares the percentage solve rate of standard prompting with that of chain-of-thought prompting.
Chain-of-thought prompting supplies reasoning demonstrations as examples in the prompt. In essence, chain-of-thought reasoning is achieved by writing out intermediate reasoning steps and incorporating them in the prompt, which significantly improves the ability of LLMs to perform complex reasoning.
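One practical consequence is that the model's completion interleaves reasoning text with the final answer, so a small parser is typically needed to recover the answer itself. The sketch below assumes the exemplar's "The answer is …" convention; the completion string is illustrative, not real model output.

```python
import re

# Sketch: extracting the final answer from a chain-of-thought completion.
# The completion below is an illustrative example, not real model output.
completion = ("The cafeteria started with 23 apples. They used 20, "
              "leaving 23 - 20 = 3. They bought 6 more, so 3 + 6 = 9. "
              "The answer is 9.")

def extract_answer(text: str):
    """Pull the numeric value following the 'The answer is' marker."""
    match = re.search(r"The answer is\s+(-?\d+)", text)
    return match.group(1) if match else None

print(extract_answer(completion))  # -> 9
```

The extraction step relies on the few-shot exemplars ending each answer with a consistent marker phrase; if the exemplars use a different convention, the pattern must match it.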
I’m currently the Chief Evangelist @ HumanFirst. I explore and write about all things at the intersection of AI and language; ranging from LLMs, Chatbots, Voicebots, Development Frameworks, Data-Centric latent spaces and more.