Module 12 Lesson 1: Introduction to Callbacks
Listening to the Chain. How to use the Callback system to intercept events like 'LLM Start' or 'Tool End' for logging and UI updates.
Callbacks: The Interceptor System
When you call chain.invoke(), it acts like a "Black Box." It starts, it thinks, and then it returns an answer. If something goes wrong inside, you have no idea what happened. Callbacks allow you to "Hook" into specific moments in the chain's lifecycle.
1. What can you "Hook"?
LangChain emits "Events" for everything:
- on_llm_start: Fired when the prompt is sent to OpenAI.
- on_tool_start: Fired when the agent decides to use a tool.
- on_llm_end: Fired when the AI finishes its answer.
- on_chain_error: Fired when everything breaks.
2. Why use Callbacks?
- Logging: Save every prompt and response to a file for auditing.
- Streaming: Send tokens to a frontend UI in real-time.
- Cost Monitoring: Calculate token usage and cost after every call (see the sketch after this list).
- UI Updates: Show a "Thinking..." spinner while the tool is running.
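For the cost-monitoring case, you don't even need a custom handler: LangChain's community package ships one for OpenAI models. The sketch below assumes the langchain-community and langchain-openai packages are installed, that an OPENAI_API_KEY is set, and that the model name is just an illustrative choice.

```python
# Sketch of the cost-monitoring use case using LangChain's built-in
# OpenAI callback (assumes langchain-community + langchain-openai installed).
from langchain_community.callbacks import get_openai_callback
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(model="gpt-4o-mini")  # illustrative model choice

with get_openai_callback() as cb:
    llm.invoke("Explain callbacks in one sentence.")

# Every call made inside the "with" block is tallied on cb.
print(f"Total tokens: {cb.total_tokens}")
print(f"Estimated cost (USD): {cb.total_cost}")
```

Any calls made inside the with block are accumulated on the cb object, so you can log or alert on cost per request.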
3. Visualizing the Interceptor
```mermaid
graph TD
    Start[User Invoke] --> C1[Event: Chain Start]
    C1 --> P[Prompt Ready]
    P --> C2[Event: LLM Start]
    C2 --> M[Model Processing]
    M --> C3[Event: LLM End]
    C3 --> Out[Final Response]
    C1 -.-> Handler[Your Logging Code]
    C2 -.-> Handler
    C3 -.-> Handler
```
4. The BaseCallbackHandler
To "Listen" to these events, you create a class that inherits from BaseCallbackHandler.
```python
from langchain_core.callbacks import BaseCallbackHandler

class MyCustomHandler(BaseCallbackHandler):
    def on_llm_start(self, serialized, prompts, **kwargs):
        # Fires right before the prompt is sent to the model.
        print(f"I am about to send this prompt to the AI: {prompts[0]}")

    def on_llm_end(self, response, **kwargs):
        # Fires once the model has finished generating.
        print("The AI has finished its answer!")
```
5. Engineering Tip: Global vs. Tool callbacks
You can attach a callback to the entire chain or just to a specific tool. For example, you might want to log every time the "Bank Transfer" tool is used, but you don't care about logging the "Math" tool.
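Here is a hedged sketch of both scopes. The tool names and the audit handler are invented for illustration; the point is where the callbacks list is attached, not the specific tools.

```python
# Sketch: tool-scoped vs. global callbacks. Tool and handler names are hypothetical.
from langchain_core.callbacks import BaseCallbackHandler
from langchain_core.tools import tool

class TransferAuditHandler(BaseCallbackHandler):
    def on_tool_start(self, serialized, input_str, **kwargs):
        # Fires whenever a tool this handler is attached to begins running.
        print(f"AUDIT: tool starting with input {input_str}")

@tool
def bank_transfer(amount: float) -> str:
    """Transfer money between accounts."""
    return f"Transferred {amount}"

@tool
def add(a: int, b: int) -> str:
    """Add two numbers."""
    return str(a + b)

# Tool-scoped: only bank_transfer reports to the audit handler; add stays silent.
audited_transfer = bank_transfer.with_config(callbacks=[TransferAuditHandler()])
audited_transfer.invoke({"amount": 100.0})

# Global: every step of a run reports to the handler.
# some_chain.invoke(inputs, config={"callbacks": [TransferAuditHandler()]})
```

Attaching the handler with with_config scopes it to that one tool, while passing callbacks in the invoke config (last line) applies it to the entire run.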
Key Takeaways
- Callbacks provide observability into the AI's internal lifecycle.
- They are event-driven (Start, End, Error).
- Use them for Logging, Monitoring, and UI feedback.
- Decoupling: Callbacks allow you to add logging without changing your prompt logic.