
Module 2 Lesson 4: Installing Ollama

Step-by-step installation guide for every platform. Get the service running and ready for models.

Installing Ollama: Step-by-Step

Now it’s time to get Ollama onto your machine. The process is remarkably simple, but we’ll still walk through each platform to make sure your environment is set up correctly.

1. Installation on macOS

  1. Download: Go to ollama.com/download and click the macOS button.
  2. Extract: Open the downloaded .zip file.
  3. Move: Drag the Ollama icon to your Applications folder.
  4. Run: Double-click Ollama in your Applications folder.
  5. Verify: You should see a small llama icon in your menu bar at the top of your screen.
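
If you prefer a package manager, Ollama can also be installed with Homebrew. This is a sketch assuming you already have Homebrew set up; the formula name below is the one published on Homebrew at the time of writing:

brew install ollama            # installs the CLI and server as a Homebrew formula
brew services start ollama     # optional: keep the server running in the background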

2. Installation on Windows

  1. Download: Go to ollama.com/download and click the Windows button.
  2. Run Installer: Open the OllamaSetup.exe.
  3. Install: Click through the setup wizard.
  4. Run: Ollama will start automatically. You can find it in your system tray (bottom right, near the clock).
  5. Shell: Open PowerShell or Command Prompt to start using the CLI.
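
If you prefer installing from the command line, Ollama is also published to the winget repository. The package ID below is an assumption based on the public winget listing, so double-check it with a search first:

winget search ollama           # confirm the package ID
winget install Ollama.Ollama   # install without the setup wizard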

3. Installation on Linux

The Linux installation is a "one-liner." Open your terminal and run:

curl -fsSL https://ollama.com/install.sh | sh

What this script does:

  • Downloads the binary to /usr/local/bin/ollama.
  • Creates an ollama user and group.
  • Sets up a systemd service so Ollama starts on boot.
  • Detects NVIDIA GPUs and, on supported distributions, installs CUDA drivers if they are missing.
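
Because the script registers a systemd service named ollama, you can confirm the server is running and watch its startup logs (including GPU detection messages) with standard systemd tooling:

systemctl status ollama        # should report "active (running)"
sudo journalctl -u ollama -f   # follow the server logs in real time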

4. Verification Check

Regardless of your platform, the best way to check that everything worked is the terminal. Open your terminal of choice (Terminal on macOS, PowerShell on Windows, or your distribution's terminal emulator on Linux) and type:

ollama --version

If you see something like ollama version is 0.4.4 (or newer), congratulations! The CLI is installed. If the command also prints a warning that it could not connect to a running instance, the background server isn't up yet: start the desktop app (macOS/Windows) or the systemd service (Linux) and try again.
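
As a second check, the Ollama server listens on port 11434 by default, and a plain HTTP request to it returns a short status message:

curl http://localhost:11434
# Expected response: Ollama is running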


Troubleshooting Common Issues

Issue: "Command Not Found" (macOS/Windows)

If you just installed it, you might need to restart your terminal window for the system to recognize the new ollama command.
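
If a fresh terminal still can't find the command, check whether the binary is actually on your PATH. The locations below are the usual install paths, but yours may differ:

which ollama                   # macOS/Linux: typically /usr/local/bin/ollama
Get-Command ollama             # Windows PowerShell: typically under %LOCALAPPDATA%\Programs\Ollama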

Issue: GPU Not Detected

If Ollama is generating text very slowly (for example, one word every second or two), it is probably running on your CPU instead of your GPU.

  • Windows: Ensure your NVIDIA drivers are up to date.
  • macOS: Older Intel-based Macs don't benefit from the Apple Silicon (M1/M2/M3) optimizations and will always run noticeably slower.
  • Linux: Ensure your user is in the video group and that nvidia-smi works.
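
A quick way to confirm where a model is actually running is the ps subcommand (available in recent Ollama versions). Load a model, then check the PROCESSOR column:

ollama ps                      # PROCESSOR shows "100% GPU", "100% CPU", or a split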

Next Steps

Now that the software is installed, we need to learn how to talk to it. In the next lesson, we will dive into the most common CLI commands you'll use every day.


Key Takeaways

  • Download the native apps for Windows and macOS.
  • Use the curl script for Linux and servers.
  • Verify the installation using ollama --version.
  • On Windows and macOS, the System Tray / Menu Bar icon shows that the background server is running, and the CLI needs that server to respond.
