Module 4 Lesson 4: Memory Integration
Connecting the threads. How to implement chat history and windowed memory in LangChain agents.
Memory in LangChain: Following the Conversation
Without memory, an agent is a goldfish. It forgets the user's name as soon as the next message arrives. In LangChain, we use Memory Classes to automatically manage the "Conversation History" and inject it into the prompt.
1. How Memory Works in the Loop
- Input: User says "Hi, I'm Sudeep."
- Memory: Saves the interaction.
- Input 2: User says "What is my name?"
- Agent: Looks at the Memory, sees "Hi, I'm Sudeep," and answers correctly.
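Under the hood, the executor calls two methods on the memory object every turn: it loads the stored history before the prompt is built, and saves the new exchange after the response comes back. A minimal sketch using the legacy `ConversationBufferMemory` API (the example strings are illustrative):

```python
from langchain.memory import ConversationBufferMemory

memory = ConversationBufferMemory(memory_key="chat_history", return_messages=True)

# Turn 1: after the agent responds, the executor saves the exchange.
memory.save_context({"input": "Hi, I'm Sudeep."}, {"output": "Nice to meet you, Sudeep!"})

# Turn 2: before calling the LLM, the executor loads the stored history
# and injects it into the prompt under the "chat_history" variable.
print(memory.load_memory_variables({}))
# -> {'chat_history': [HumanMessage("Hi, I'm Sudeep."), AIMessage("Nice to meet you, Sudeep!")]}
```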
2. Types of LangChain Memory
A. ConversationBufferMemory
Saves every single word of the conversation.
- Pros: Perfectly accurate.
- Cons: Eventually overflows the LLM's context window.
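A minimal instantiation (the `memory_key` and `return_messages` choices here mirror the agent setup in section 3):

```python
from langchain.memory import ConversationBufferMemory

# Stores every message verbatim and returns them as a list of chat messages.
memory = ConversationBufferMemory(memory_key="chat_history", return_messages=True)
```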
B. ConversationBufferWindowMemory
Only keeps the last K interactions (e.g., the last 5 turns).
- Pros: Keeps the prompt small and fast.
- Cons: The agent will "forget" things that happened early in the session.
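Same idea, but with a `k` cap on how many exchanges are kept:

```python
from langchain.memory import ConversationBufferWindowMemory

# k=5 keeps only the five most recent human/AI exchanges in the prompt.
memory = ConversationBufferWindowMemory(k=5, memory_key="chat_history", return_messages=True)
```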
C. ConversationSummaryMemory
Uses an LLM to consolidate the history into a few sentences.
- Pros: Can handle massive conversations in a tiny prompt.
- Cons: Is computationally expensive because it calls an LLM every turn just to summarize.
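Because the summary is produced by a model, this class needs an LLM of its own. A sketch, assuming an OpenAI chat model via `langchain_openai` (any chat model works; the model name is a hypothetical choice):

```python
from langchain.memory import ConversationSummaryMemory
from langchain_openai import ChatOpenAI  # assumed provider for illustration

# The summarizer LLM is called each turn to fold the latest exchange
# into a running summary, which is what gets injected into the prompt.
memory = ConversationSummaryMemory(
    llm=ChatOpenAI(model="gpt-4o-mini"),  # hypothetical model choice
    memory_key="chat_history",
    return_messages=True,
)
```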
3. Implementing Memory in an Agent
Since the agent's prompt expects specific input keys, we must map our memory correctly: the `memory_key` must match the `chat_history` placeholder in the prompt, otherwise the history is never injected.
```python
from langchain.memory import ConversationBufferMemory
from langchain.agents import AgentExecutor

# 1. Initialize memory. memory_key must match the MessagesPlaceholder
#    variable name ("chat_history") in the agent's prompt.
memory = ConversationBufferMemory(memory_key="chat_history", return_messages=True)

# 2. Add to the executor (agent and tools come from the previous lessons).
agent_executor = AgentExecutor(
    agent=agent,
    tools=tools,
    memory=memory,  # <--- Here!
    verbose=True
)
```
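With memory attached, the executor threads the history between calls automatically. A usage sketch (assuming the `agent` and `tools` built in the previous lessons):

```python
# First turn: introduce a fact.
agent_executor.invoke({"input": "Hi, I'm Sudeep."})

# Second turn: the saved history is injected as {chat_history},
# so the agent can answer without being told the name again.
result = agent_executor.invoke({"input": "What is my name?"})
print(result["output"])
```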
4. Short-Term vs. Persistent Memory
The memory classes above live only in process memory. If you restart your Python script, the agent has amnesia. For production apps, you must back the history with a database like Redis, DynamoDB, or Postgres.
```python
from langchain_community.chat_message_histories import RedisChatMessageHistory

history = RedisChatMessageHistory(session_id="user_123", url="redis://localhost:6379")
# Now your agent's memory persists even if the server restarts!
```
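On its own, `RedisChatMessageHistory` is just a message store. To make the agent actually read from and write to Redis, point a memory class at it through the `chat_memory` field, roughly like this:

```python
from langchain.memory import ConversationBufferMemory

# The same buffer memory as before, but backed by Redis instead of process RAM.
memory = ConversationBufferMemory(
    memory_key="chat_history",
    chat_memory=history,   # the RedisChatMessageHistory from above
    return_messages=True,
)
# Pass this `memory` to the AgentExecutor exactly as in section 3.
```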
5. Visualizing the Memory Injection
```mermaid
graph LR
    User[User Message] --> Prompt[Final Prompt Construction]
    History[Memory Store] -->|"{chat_history}"| Prompt
    Prompt --> LLM[LLM Brain]
    LLM -->|Save Result| History
```
Key Takeaways
- Memory Classes automate the saving and retrieving of chat messages.
- Window Memory usually offers the best balance of speed and relevance for simple agents.
- Summary Memory suits long-running "companion"-style agents whose full transcript won't fit in the context window.
- External Storage (Redis/SQL) is mandatory for production-ready state persistence.