Module 2 Lesson 5: Ollama CLI Basics
Mastering the command line. A guide to pull, run, list, and manage models directly from your terminal.
Ollama CLI Basics: Your Control Center
The Command Line Interface (CLI) is where the magic happens. While there are many GUIs for Ollama, knowing the core commands will make you a much more efficient AI developer.
Here are the "Essential Seven" commands you need to know.
1. ollama run [model]
This is the most popular command. It does two things:
- Checks if you have the model downloaded. If not, it downloads it.
- Starts an interactive chat session in your terminal.
Example:
ollama run llama3
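Beyond the interactive session, run also accepts a prompt as an argument, in which case Ollama prints a single response and exits — handy for scripting. A quick sketch (the model name and prompt are just examples):

```shell
# Interactive session (type /bye to leave):
ollama run llama3

# One-shot: pass the prompt as an argument and Ollama prints
# the response and exits, which is useful in scripts.
ollama run llama3 "Summarize what a context window is in one sentence."
```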
2. ollama pull [model]
Use this when you want to download a model in the background without starting a chat. This is great for preparing your system before a flight or a demo.
Example:
ollama pull mistral
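Model names can carry a tag that selects a specific size or quantization; pulling without a tag fetches :latest. For example (exact tags vary by model — check its page in the Ollama library):

```shell
# Pull the default tag (equivalent to mistral:latest):
ollama pull mistral

# Pull a specific tagged variant:
ollama pull llama3:8b
```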
3. ollama list (or ollama ls)
Shows every model currently stored on your local disk. It also tells you the size of each model and when it was last modified.
Example:
ollama list
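The output is a simple table. The names, IDs, sizes, and dates below are illustrative, not real values:

```shell
$ ollama list
NAME             ID              SIZE      MODIFIED
llama3:latest    365c0bd3c000    4.7 GB    2 days ago
mistral:latest   61e88e884507    4.1 GB    3 weeks ago
```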
4. ollama ps
Shows which models are currently active in RAM. If you are running three different apps using Ollama, this command will help you see which models are eating your memory.
Example:
ollama ps
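Sample output (values illustrative). Note that SIZE here is the model's footprint in memory, which is typically larger than its size on disk, and UNTIL shows when an idle model will be unloaded:

```shell
$ ollama ps
NAME           ID              SIZE      PROCESSOR    UNTIL
llama3:latest  365c0bd3c000    6.7 GB    100% GPU     4 minutes from now
```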
5. ollama rm [model]
Models are large. If you are running out of disk space, use rm to delete models you no longer need.
Example:
ollama rm gemma
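If you pulled a specific tag, pass that same tag to rm; a quick list afterwards confirms the model is gone:

```shell
# Remove the default tag...
ollama rm gemma

# ...or a specific one, if that is what you pulled:
ollama rm gemma:2b

# Confirm the space was reclaimed:
ollama list
```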
6. ollama cp [source] [destination]
Copies a model. This is very useful when you want to take a base model (like llama3) and create a customized version of it with a different system prompt (which we will do in Module 5).
Example:
ollama cp llama3 my-private-bot
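A short sequence showing the idea (my-private-bot is just an example name):

```shell
# Duplicate the base model under a new name:
ollama cp llama3 my-private-bot

# The copy appears as its own entry:
ollama list

# ...and runs like any other model:
ollama run my-private-bot
```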
7. ollama serve
If you are on Linux or running in a headless environment, you use serve to start the backend engine. (Windows and Mac users typically don't need this, as the app handles it automatically).
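A minimal sketch of running the server manually. By default it listens on port 11434, so a quick curl from another terminal confirms it is up:

```shell
# Start the backend engine (Linux / headless):
ollama serve

# To accept connections from other machines, not just localhost:
OLLAMA_HOST=0.0.0.0 ollama serve

# From another terminal, list installed models over the API:
curl http://localhost:11434/api/tags
```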
Navigating the Interactive Chat
When you are inside an ollama run session, there are special "slash commands" to control the chat:
- /?: Shows help.
- /set system "...": Changes the system prompt on the fly.
- /show info: Displays the model's technical specs.
- /bye: Exits the chat and returns to your terminal.
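Put together, a short session might look like this (the >>> prompt is drawn by Ollama; the system prompt text is just an example):

```text
>>> /set system "You are a concise assistant. Answer in one sentence."
>>> /show info
>>> Why is the sky blue?
...
>>> /bye
```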
Summary Cheat Sheet
| Command | Action |
|---|---|
| ollama run | Download & Start Chat |
| ollama pull | Download Only |
| ollama list | Show downloaded models |
| ollama ps | Show models in RAM |
| ollama rm | Delete a model |
| ollama --help | Show all CLI options |
Practice Exercise
Open your terminal and try to run ollama list. If you see an empty list, don't worry—in the next module, we will fill it up with the world's most powerful open-source models!
Key Takeaways
- The CLI is the direct way to communicate with the Ollama server.
- run is for interaction; pull is for preparation.
- ps and list are your primary diagnostic tools for memory and storage.
- Use /bye to exit a running session.