Module 3 Lesson 3: ChatPromptTemplate (Messaging Abstraction)

The Agent's Blueprint. How to create templates for multi-role conversations (System, Human, AI).

ChatPromptTemplate: Architecture of Conversation

In Module 2, we learned about Chat Models and their message types. While PromptTemplate handles a single string, ChatPromptTemplate handles an entire list of dynamic messages. It is the primary prompting tool for building production chatbots and agents.

1. Defining the Message List

Instead of a single string, we use a list of tuples: (role, content).

from langchain.prompts import ChatPromptTemplate

template = ChatPromptTemplate.from_messages([
    ("system", "You are a helpful assistant named {name}."),
    ("human", "Hi, can you help me with {task}?"),
])

# Format the variables
formatted = template.format_messages(
    name="Jarvis", 
    task="coding a website"
)

# This returns a LIST of Message objects
print(formatted)

2. Why This Is Superior to Plain Strings

When you use .format_messages(), LangChain automatically creates SystemMessage and HumanMessage objects for you. You can pass the result directly to model.invoke().

# `model` is any chat model instance (see the Chat Models from Module 2)
response = model.invoke(formatted)
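
To see this for yourself, inspect the list that .format_messages() returned. The sketch below reuses the formatted variable from the example above; the commented-out ChatOpenAI lines are just one possible model choice and assume the langchain-openai package and an API key.

from langchain_core.messages import SystemMessage, HumanMessage

# format_messages() built real Message objects, not plain strings
assert isinstance(formatted[0], SystemMessage)  # "You are a helpful assistant named Jarvis."
assert isinstance(formatted[1], HumanMessage)   # "Hi, can you help me with coding a website?"

# One possible model to send them to (requires langchain-openai and an API key):
# from langchain_openai import ChatOpenAI
# model = ChatOpenAI(model="gpt-4o-mini")
# print(model.invoke(formatted).content)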

3. The MessagesPlaceholder

Sometimes you don't know in advance how many messages a conversation will contain, but you still want to inject the entire history. We use a placeholder for this.

from langchain.prompts import MessagesPlaceholder

template = ChatPromptTemplate.from_messages([
    ("system", "You are a helpful assistant."),
    MessagesPlaceholder(variable_name="history"),
    ("human", "{input}"),
])
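
Here is a minimal sketch of filling that placeholder with a hand-written history; in a real chatbot the history list would come from your memory layer rather than being typed out.

from langchain_core.messages import HumanMessage, AIMessage

history = [
    HumanMessage(content="My name is Alex."),
    AIMessage(content="Nice to meet you, Alex! How can I help?"),
]

# The placeholder expands into the full history, so the model sees:
# system message + every history message + the new human turn
messages = template.format_messages(
    history=history,
    input="What was my name again?",
)

print(messages)  # 4 messages: System, Human, AI, Human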

4. Visualizing the Message Stack

graph TD
    S["System: You are a {role}"]
    H["Human: Help with {query}"]
    Stack[ChatPromptTemplate]
    Stack --> S
    Stack --> H
    Var["Vars: role='Chef', query='Eggs'"] --> Stack
    Stack --> Final[List of Real Messages]
    Final --> LLM[Chat Model]
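
The same flow in code, using the role='Chef', query='Eggs' values from the diagram. The variable names below just mirror the diagram and are not part of any LangChain API.

from langchain.prompts import ChatPromptTemplate

chef_template = ChatPromptTemplate.from_messages([
    ("system", "You are a {role}."),
    ("human", "Help with {query}"),
])

# Variables go in, a list of real Message objects comes out
real_messages = chef_template.format_messages(role="Chef", query="Eggs")
# real_messages is now ready for a chat model: model.invoke(real_messages)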

5. Engineering Tip: Standard Personas

Create a library of standard "System Personas."

  • DEBUG_PROMPT: "You are a senior debugger. Be critical."
  • SUPPORT_PROMPT: "You are a friendly support bot. Be polite."

You can swap these templates into your ChatPromptTemplate to change the agent's behavior instantly, as sketched below.
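
A minimal sketch of such a persona library: the constant names and the build_agent_prompt helper are purely illustrative, not part of LangChain.

from langchain.prompts import ChatPromptTemplate

# Your own library of system personas (plain strings)
DEBUG_PROMPT = "You are a senior debugger. Be critical."
SUPPORT_PROMPT = "You are a friendly support bot. Be polite."

def build_agent_prompt(persona: str) -> ChatPromptTemplate:
    """Wrap the chosen persona in a ChatPromptTemplate."""
    return ChatPromptTemplate.from_messages([
        ("system", persona),
        ("human", "{input}"),
    ])

# Swapping the persona changes the agent's behavior instantly
debug_prompt = build_agent_prompt(DEBUG_PROMPT)
support_prompt = build_agent_prompt(SUPPORT_PROMPT)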

Key Takeaways

  • ChatPromptTemplate is designed for the message-based world of modern LLMs.
  • It supports variables at every level (System, Human, and AI).
  • format_messages() is the key method to generate model-ready lists.
  • MessagesPlaceholder allows for dynamic insertion of conversation history.
