Module 2 Lesson 4: Installing Ollama
Step-by-step installation guide for every platform. Get the service running and ready for models.
Installing Ollama: Step-by-Step
Now it’s time to get Ollama onto your machine. The process is remarkably simple, but we will go through each platform to ensure your environment is set up correctly.
1. Installation on macOS
- Download: Go to ollama.com/download and click the macOS button.
- Extract: Open the downloaded `.zip` file.
- Move: Drag the `Ollama` icon to your Applications folder.
- Run: Double-click `Ollama` in your Applications folder.
- Verify: You should see a small llama icon in the menu bar at the top of your screen.
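If you prefer the terminal, you can launch the app from there instead of Finder. A small sketch, assuming the app bundle is named `Ollama` as installed above:

```shell
# macOS only: launch the Ollama menu bar app from the terminal.
# On any other OS this just prints a note instead.
if [ "$(uname)" = "Darwin" ]; then
  open -a Ollama
else
  echo "this launcher is macOS only"
fi
```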
2. Installation on Windows
- Download: Go to ollama.com/download and click the Windows button.
- Run Installer: Open the downloaded `OllamaSetup.exe`.
- Install: Click through the setup wizard.
- Run: Ollama will start automatically. You can find it in your system tray (bottom right, near the clock).
- Shell: Open PowerShell or Command Prompt to start using the CLI.
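To confirm the tray app's background server is actually answering, you can poke its HTTP endpoint. Windows 10 and later ship `curl.exe`, so this works from Command Prompt too; port 11434 is Ollama's default listen port:

```shell
# If the server is up, it replies with a short status message
# ("Ollama is running" in current releases); otherwise we print a hint.
curl -s http://localhost:11434 || echo "server not reachable - is the tray icon running?"
```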
3. Installation on Linux
The Linux installation is a "one-liner." Open your terminal and run:
curl -fsSL https://ollama.com/install.sh | sh
What this script does:
- Downloads the binary to `/usr/local/bin/ollama`.
- Creates an `ollama` user and group.
- Sets up a systemd service so Ollama starts on boot.
- Detects NVIDIA GPUs and installs the necessary drivers if missing (and supported).
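Once the script finishes, you can confirm the background service with systemd's own tooling (Linux only):

```shell
# Check the service the install script registered.
# "is-active" prints "active" on success and exits non-zero otherwise.
systemctl is-active ollama 2>/dev/null || echo "ollama service is not active"

# To follow the server logs (handy later when debugging model loads):
# journalctl -u ollama -f
```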
4. Verification Check
Regardless of your platform, the best way to check that everything worked is from the terminal. Open your terminal of choice (Terminal on macOS, PowerShell on Windows, Bash on Linux) and type:
ollama --version
If you see something like `ollama version is 0.4.4` (or newer), congratulations! The server and CLI are ready.
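If you want to run this check from a setup script rather than by hand, a minimal sketch that degrades gracefully when the CLI is missing:

```shell
# Scriptable install check: is the CLI on PATH, and which version is it?
if command -v ollama >/dev/null 2>&1; then
  ollama --version
else
  echo "ollama not found on PATH - restart your terminal or reinstall"
fi
```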
Troubleshooting Common Issues
Issue: "Command Not Found" (macOS/Windows)
If you just installed Ollama, restart your terminal window so the shell picks up the updated PATH and recognizes the new `ollama` command.
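If a restart does not fix it, you can inspect PATH directly. The usual install location is `/usr/local/bin` on macOS and Linux; on Windows the installer adds its own directory to the user PATH:

```shell
# Print each PATH entry on its own line so you can scan
# for the directory that contains the ollama binary.
echo "$PATH" | tr ':' '\n'
```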
Issue: GPU Not Detected
If Ollama is generating very slowly (on the order of one word every couple of seconds), it is probably running on your CPU instead of your GPU.
- Windows: Ensure your NVIDIA drivers are up to date.
- macOS: Older Macs (Intel-based) do not have the same optimization as Apple Silicon (M1/M2/M3) and will inherently be slower.
- Linux: Ensure your user is in the `video` group and that `nvidia-smi` works.
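The two Linux checks above can be run as follows; this assumes an NVIDIA card, and the `usermod` line is the standard way to add yourself to a group:

```shell
# 1. Driver check: nvidia-smi should print a table of detected GPUs.
nvidia-smi 2>/dev/null || echo "nvidia-smi failed - driver not installed or GPU not visible"

# 2. Group check: list your groups and look for "video".
id -nG | tr ' ' '\n' | grep -x video || echo "current user is not in the video group"

# 3. To add yourself (takes effect after logging out and back in):
# sudo usermod -aG video "$USER"
```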
Next Steps
Now that the software is installed, we need to learn how to talk to it. In the next lesson, we will dive into the most common CLI commands you'll use every day.
Key Takeaways
- Download the native apps for Windows and macOS.
- Use the curl script for Linux and servers.
- Verify the installation using `ollama --version`.
- The System Tray (Windows) or Menu Bar (macOS) icon must be present for the CLI to talk to the server.