Module 8 Lesson 1: Why Memory is Needed

Breaking the amnesia: understanding why LLMs are stateless and how we supply "history" to simulate a conversation.

The Memory Problem: Breaking the Amnesia

Large Language Models are Stateless. This means they have no memory of anything that happened outside of the current request.

  • Request 1: "My name is Sudeep."
  • Request 2: "What is my name?" → Answer: "I don't know."

If you want a chatbot to feel like a person, you have to "Remind" it of the previous conversation every time you send a new prompt.
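The reminder trick can be seen in a few lines of plain Python. The `fake_llm` function below is a hypothetical stand-in for a real chat model call, included only to show how the same question succeeds or fails depending on what history is packed into the prompt:

```python
# A stateless "model": it only sees the text of the current request.
# Hypothetical stand-in for a real chat model call.
def fake_llm(prompt: str) -> str:
    if "Sudeep" in prompt:
        return "Your name is Sudeep."
    return "I don't know."

# Request 2 alone fails: the model has no memory of Request 1.
answer_without_history = fake_llm("What is my name?")

# Prepending the transcript of Request 1 "reminds" the model.
history = "User: My name is Sudeep.\nAI: Nice to meet you!"
prompt = f"{history}\nUser: What is my name?"
answer_with_history = fake_llm(prompt)
```

Every chat framework's memory feature is ultimately doing this: stitching old messages into the new prompt before it reaches the model.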

1. State vs. LLM

Imagine the LLM is a math teacher. Every time you ask a question, the teacher enters the room with a blank mind. If you want the teacher to remember your previous question, you have to hand them a Summary or a Transcript of your previous conversation before they sit down.


2. Managing the Context Window

You can't just send the entire history of a 1-hour chat. Eventually, you will hit the Context Window Limit (Module 2).

  • Goal: Keep the history short, relevant, and dense.
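A minimal sketch of one common trimming strategy, keeping only the most recent N messages (real systems often trim by token count instead; the function name here is an illustration, not a library API):

```python
def trim_history(messages: list[str], max_messages: int = 4) -> list[str]:
    """Keep only the most recent messages so the prompt stays within
    the context window. Older messages are simply dropped."""
    return messages[-max_messages:]

# A long chat produces 10 messages, but only the last 4 are sent.
history = [f"msg {i}" for i in range(10)]
recent = trim_history(history)
```

Dropping old messages is the simplest policy; summarizing them into a shorter "dense" form is the usual upgrade when distant context still matters.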

3. Visualizing the Reminder

```mermaid
graph LR
    User[Query 2: 'Repeat my name'] --> App[Your App]
    History[DB: 'User said his name is Sudeep'] --> App
    App -->|Merged| Prompt[Prompt: 'Conversation: ... User: Repeat name']
    Prompt --> LLM[Chat Model]
    LLM --> Result['Your name is Sudeep']
```

4. The BaseChatMemory Class

LangChain provides a standard interface for this "Reminder" logic.

  1. save_context: Writes a pair of human/AI messages to the database.
  2. load_memory_variables: Reads the history and prepares it for the prompt.
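To make the two methods concrete, here is a minimal hand-rolled sketch of the same pattern. This is not the real LangChain class (which integrates with message stores and prompt templates); it just mirrors the `save_context` / `load_memory_variables` shape in plain Python:

```python
class SimpleChatMemory:
    """Minimal sketch of the save/load memory pattern.
    Not the real LangChain BaseChatMemory, just its shape."""

    def __init__(self):
        self.messages: list[tuple[str, str]] = []

    def save_context(self, human_input: str, ai_output: str) -> None:
        # Write a pair of human/AI messages to the store.
        self.messages.append(("Human", human_input))
        self.messages.append(("AI", ai_output))

    def load_memory_variables(self) -> dict:
        # Read the history and format it for insertion into the prompt.
        history = "\n".join(f"{role}: {text}" for role, text in self.messages)
        return {"history": history}

memory = SimpleChatMemory()
memory.save_context("My name is Sudeep.", "Nice to meet you!")
variables = memory.load_memory_variables()
```

The `history` string returned here is exactly what gets merged into the prompt in the diagram above.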

5. Engineering Tip: Session IDs

In a production app with 1,000 users, you need a Session ID for every conversation. You don't want User A's memory leaking into User B's prompt! This is handled using External Databases like Redis or Postgres to store the memory strings.
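A sketch of per-session isolation, assuming an in-memory dict where production would use Redis or Postgres (the class and method names are illustrative, not a library API):

```python
from collections import defaultdict


class SessionMemoryStore:
    """Keeps one history per session_id so users never share memory.
    In production, the dict would be replaced by Redis or Postgres."""

    def __init__(self):
        self._store: defaultdict[str, list] = defaultdict(list)

    def save(self, session_id: str, human: str, ai: str) -> None:
        # Each conversation turn lands only in its own session's list.
        self._store[session_id].append((human, ai))

    def load(self, session_id: str) -> list:
        return self._store[session_id]


store = SessionMemoryStore()
store.save("user-a", "My name is Sudeep.", "Hi Sudeep!")
store.save("user-b", "My name is Alice.", "Hi Alice!")
```

Because every read and write is keyed by `session_id`, User A's prompt can never be assembled from User B's history.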


Key Takeaways

  • LLMs are stateless and do not remember past requests.
  • Memory works by appending past messages to the current prompt.
  • We must manage the Context Window to prevent expensive or failing prompts.
  • Session IDs are the key to multi-user memory isolation.
