Module 15 Lesson 1: Packaging LangChain Apps
From Script to Service. How to organize your code and dependencies for reliable deployment on any server.
Packaging for the Cloud: Clean Projects
A professional LangChain app is not a single file; it is a package. If you send just your app.py to a friend, it will fail because they don't have your fifteen libraries or your .env file. Proper packaging ensures that your code runs identically on your laptop and in the cloud.
1. The Standard Project Structure
my-ai-app/
├── main.py # FastAPI entry point
├── chain_logic.py # Your LangChain definitions
├── requirements.txt # All your pip libraries
├── .env # Secrets (NOT pushed to GitHub!)
└── data/ # Local data/vector storage
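To make the separation of logic (chain_logic.py) from transport (main.py) concrete, here is a minimal sketch; the model name, prompt, and endpoint are illustrative placeholders, not prescribed by the lesson:

# chain_logic.py -- all LangChain wiring lives here
from langchain_openai import ChatOpenAI
from langchain_core.prompts import ChatPromptTemplate

def build_chain():
    prompt = ChatPromptTemplate.from_template("Answer briefly: {question}")
    llm = ChatOpenAI(model="gpt-4o-mini")  # placeholder model; reads OPENAI_API_KEY from the environment
    return prompt | llm

# main.py -- FastAPI is only the transport layer; it imports the chain and exposes it
from fastapi import FastAPI
from pydantic import BaseModel
from chain_logic import build_chain

app = FastAPI()
chain = build_chain()

class Query(BaseModel):
    question: str

@app.post("/ask")
def ask(query: Query):
    result = chain.invoke({"question": query.question})
    return {"answer": result.content}

Run it locally with uvicorn main:app --reload; the same two files deploy unchanged to a server.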
2. Managing Dependencies
LangChain updates fast. If you don't pin your versions, your code might break next month.
Bad requirements.txt:
langchain
langchain-openai
Good requirements.txt:
langchain==0.3.0
langchain-openai==0.2.1
python-dotenv
fastapi
uvicorn
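As an optional sanity check (my own sketch, not a required file in this project), you can assert at startup that the installed versions actually match the pins, using importlib.metadata from the standard library:

# check_versions.py -- fail fast if the environment drifts from requirements.txt
from importlib.metadata import version

PINS = {"langchain": "0.3.0", "langchain-openai": "0.2.1"}  # mirror the pinned entries

for package, expected in PINS.items():
    installed = version(package)
    if installed != expected:
        raise RuntimeError(f"{package} is {installed}, expected {expected}")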
3. Visualizing the Build Artifact
graph LR
Src[Source Code] --> Pack[Packaging Tool]
Deps[Dependencies] --> Pack
Pack --> Artifact[Distro / Image]
Artifact --> Production[Cloud Server]
4. Handling Environments
Never hardcode the path to your vector store index. Use environment variables!
- Instead of:
db = FAISS.load_local("/Users/sudeep/apps/index", embeddings)
- Use:
db = FAISS.load_local(os.getenv("INDEX_PATH"), embeddings)
This allows you to move the index folder on the server without changing the code.
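Putting it together, here is a sketch of environment-driven configuration; it assumes langchain-community and faiss-cpu are installed alongside the pinned packages, that the index was built with OpenAI embeddings, and that INDEX_PATH is defined in .env:

# settings come from the environment, not from the source code
import os
from dotenv import load_dotenv
from langchain_openai import OpenAIEmbeddings
from langchain_community.vectorstores import FAISS

load_dotenv()  # loads OPENAI_API_KEY, INDEX_PATH, etc. from .env into the environment

index_path = os.getenv("INDEX_PATH", "./data/index")  # fall back to the local data/ folder
embeddings = OpenAIEmbeddings()

db = FAISS.load_local(
    index_path,
    embeddings,
    allow_dangerous_deserialization=True,  # recent versions require this flag to load the pickled docstore
)

On the server you change only the .env file (or the real environment variable), never the code.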
5. Engineering Tip: pip freeze
When you have your app working perfectly, run:
pip freeze > requirements.txt
This automatically captures every single library and version currently in your virtual environment.
Key Takeaways
- Version Pinning is mandatory for AI library stability.
- Keep logic (LangChain) separate from the transport layer (FastAPI).
- Use Environment Variables for secrets and file paths.
- Clean project structure is the prerequisite for Dockerization.