This article shows how to use prompt templates in LangChain to create reusable prompts for models. In the post First LangChain Program: Ask Me Anything we saw how to connect to different models through LangChain's standard interfaces and pass them a user prompt. The limitation there was that the prompt was hardcoded. What if you want to let users create dynamic prompts where certain values are filled in at runtime? For example-
"Write a 2-page blog post on the topic {topic}"
Here {topic} is a placeholder which can be replaced by the actual value later.
Or
"Write a {programming_language} program for {topic}"
Here the user can supply the choice of language and the topic later; these values are substituted into the prompt before it is sent to the model.
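Conceptually, this kind of substitution works just like Python's built-in str.format. A minimal sketch in plain Python (no LangChain yet), using the template above:

```python
# A template string with two placeholders
template = "Write a {programming_language} program for {topic}"

# Fill the placeholders at runtime with user-supplied values
prompt = template.format(programming_language="Python", topic="sorting a list")
print(prompt)  # Write a Python program for sorting a list
```

LangChain's prompt template classes build on this idea and add validation, message roles and composability.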
Creating prompt templates in LangChain
LangChain gives two options for creating such dynamic prompt templates.
- PromptTemplate- This class is best suited for single-message, text-completion-style prompts. A prompt template wraps a template string with placeholders and formats it with input values.
- ChatPromptTemplate- ChatPromptTemplate is best for multi-role conversations. It uses a list of messages, each with a defined role (system, human, ai, etc.).
PromptTemplate in LangChain
PromptTemplate is a utility for creating structured prompts for LLMs. It lets you define a template string with placeholders which can be replaced by actual values later.
LangChain PromptTemplate example
Let’s say you want to create an AI blog post generator which can write a blog post for the given topic. For this example, I have used a locally deployed "llama3.1" model using Ollama.
from langchain_core.prompts import PromptTemplate
from langchain_ollama import ChatOllama
# Define a prompt template for generating a blog post
template = """
You are an expert technical content writer. Write a detailed blog post as per the given instructions. The blog post should be engaging, informative, and well-structured. Include an introduction, main body, and conclusion. Use subheadings where appropriate and provide examples to illustrate key points.
***Instructions***: Write a {no_of_paras} paragraphs blog post about the following topic: {topic}.
"""
# Create a PromptTemplate object
prompt = PromptTemplate.from_template(template)
# Initialize the Ollama model
model = ChatOllama(model="llama3.1", temperature=0.7)
# Define the topic and number of paragraphs for the blog post
topic = "Dictionary in Python"
no_of_paras = 6
# Format the prompt with the topic and number of paragraphs
formatted_prompt = prompt.format(topic=topic, no_of_paras=no_of_paras)
# Print the formatted prompt to verify its correctness
print("Formatted Prompt:\n", formatted_prompt)
# Generate the blog post using the model
response = model.invoke(formatted_prompt)
# Print the generated blog post
print("\nGenerated Blog Post:\n", response.content)
Formatted prompt output is as given below-
You are an expert technical content writer. Write a detailed blog post as per the given instructions. The blog post should be engaging, informative, and well-structured. Include an introduction, main body, and conclusion. Use subheadings where appropriate and provide examples to illustrate key points. ***Instructions***: Write a 6 paragraphs blog post about the following topic: Dictionary in Python.
ChatPromptTemplate in LangChain
ChatPromptTemplate in LangChain is a class for building structured prompts for chat models. You define messages with roles like system, human, and ai.
For example-
from langchain_core.prompts import ChatPromptTemplate

template = ChatPromptTemplate(
[
("system", "You are a helpful AI bot. Your name is {name}."),
("human", "Hello, how are you doing?"),
("ai", "I'm doing well, thanks!"),
("human", "{user_input}"),
]
)
There are also specific prompt template utility classes for creating the different kinds of messages.
- AIMessagePromptTemplate- The AIMessagePromptTemplate class in LangChain is used to create a template for messages that come from the AI within a chat conversation.
- HumanMessagePromptTemplate- The HumanMessagePromptTemplate class in LangChain is used to create a template for messages that come from the user.
- SystemMessagePromptTemplate- The SystemMessagePromptTemplate class in LangChain is used to define a system-level instruction (set context) for a language model within a chat application.
LangChain ChatPromptTemplate example
We’ll reuse the previous blog-generation example with one change. In the previous example the system context is always set to "technical content writer". In this example we’ll set this expertise later; at the start it too is a placeholder. Let’s see how to do it.
A separate file, prompt.py, is used for creating the templates.
prompt.py
system_prompt_template = """
You are an expert {expertise} content writer. Write a detailed blog post as per the given
instructions. The blog post should be engaging, informative, and well-structured. Include an introduction, main body, and conclusion. Use subheadings where appropriate and provide examples to illustrate key points.
"""
human_prompt_template = """
***Instructions***: Write a {no_of_paras} paragraphs blog post about the following topic: {topic}.
"""
As you can see, we have two separate templates, one for the system message and one for the human message, with three placeholders in total: {expertise}, {no_of_paras} and {topic}.
chattemplatedemo.py
This is the script where both templates are imported from the prompt.py file and used to create messages for the System and Human roles. I have used a locally deployed "llama3.1" model using Ollama.
from langchain_core.prompts import ChatPromptTemplate, SystemMessagePromptTemplate, HumanMessagePromptTemplate
from langchain_ollama import ChatOllama
from prompt import system_prompt_template, human_prompt_template
system_message = SystemMessagePromptTemplate.from_template(system_prompt_template)
human_message = HumanMessagePromptTemplate.from_template(human_prompt_template)
# Create a ChatPromptTemplate object
prompt = ChatPromptTemplate.from_messages([system_message, human_message])
# Initialize the Ollama model
model = ChatOllama(model="llama3.1", temperature=0.7)
# Define the topic and number of paragraphs for the blog post
topic = "What is an ideal duration to keep any stock in your portfolio?"
no_of_paras = 3
expertise = "financial"
# Format the prompt with the expertise, topic and number of paragraphs
formatted_prompt = prompt.format(topic=topic, no_of_paras=no_of_paras, expertise=expertise)
# Print the formatted prompt to verify its correctness
print("Formatted Prompt:\n", formatted_prompt)
# Generate the blog post using the model
response = model.invoke(formatted_prompt)
# Print the generated blog post
print("\nGenerated Blog Post:\n", response.content)
This is how the formatted prompt looks with the inserted values.
Formatted Prompt: System: You are an expert financial content writer. Write a detailed blog post as per the given instructions. The blog post should be engaging, informative, and well-structured. Include an introduction, main body, and conclusion. Use subheadings where appropriate and provide examples to illustrate key points. Human: ***Instructions***: Write a 3 paragraphs blog post about the following topic: What is an ideal duration to keep any stock in your portfolio?.
That's all for this topic Prompt Templates in LangChain With Examples. If you have any doubt or any suggestions to make please drop a comment. Thanks!