Module 5 Lesson 1: What a Modelfile Is

The blueprint of a model. Understanding how to configure your AI using simple text files.

What a Modelfile Is: Your AI's DNA

Until now, we have been using "standard" models exactly as they come from the registry. But what if you want a model that always speaks in a specific tone, uses a specific format, or has a larger context window by default?

To do this, you use a Modelfile.

1. The "Dockerfile" for AI

If you are familiar with Docker, you know that a Dockerfile contains a list of instructions for building an image. A Modelfile applies the same concept to LLMs.

It is a simple plain-text file that tells Ollama four things (a short sketch follows the list):

  1. Which base model to use (e.g., Llama 3).
  2. What the system instructions are (e.g., "You are a helpful assistant").
  3. What the technical parameters are (e.g., temperature, context window size).
  4. What the stop sequences are (e.g., stop generating when the word "End" appears).
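
To make those four instruction types concrete, here is a minimal sketch of a Modelfile that uses one of each. The specific values are placeholders chosen for illustration, and the lines starting with # are comments:

# 1. Base model
FROM llama3
# 2. System instructions
SYSTEM "You are a helpful assistant."
# 3. Technical parameter
PARAMETER temperature 0.8
# 4. Stop sequence
PARAMETER stop "End"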

2. Why use a Modelfile?

Reason A: Consistency

If you are building an app, you don't want to resend a massive system prompt with every user message. With a Modelfile, you "bake" that personality into the model itself and simply run ollama run my-custom-model.
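
To see what this saves, consider Ollama's local REST API (it listens on port 11434 by default). Without a custom model, every request has to repeat the system prompt; the model name my-custom-model, the Acme wording, and the user question below are all placeholders:

curl http://localhost:11434/api/chat -d '{
  "model": "llama3",
  "messages": [
    {"role": "system", "content": "You are a polite support agent for Acme Corp. Keep answers brief."},
    {"role": "user", "content": "Where is my order?"}
  ]
}'

With the personality baked into my-custom-model, the same request shrinks to:

curl http://localhost:11434/api/chat -d '{
  "model": "my-custom-model",
  "messages": [{"role": "user", "content": "Where is my order?"}]
}'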

Reason B: Technical Optimization

You can create a version of Llama 3 that always defaults to a 32k context window or a temperature of 0.2, so you don't have to set those parameters manually every time you open the terminal.
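
In Ollama's parameter names, the context window is num_ctx (measured in tokens), so a minimal sketch of those defaults is just three lines:

FROM llama3
PARAMETER num_ctx 32768
PARAMETER temperature 0.2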

Reason C: Security and Privacy

You can bake internal company rules or "Guardrails" into the Modelfile so that the model refuses to answer questions outside of its intended scope.
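
The rules themselves go into the SYSTEM instruction. Here is a rough sketch using an invented internal policy (the team name and wording are purely illustrative):

FROM llama3
SYSTEM """
You are the internal assistant for the Acme engineering team.
Only answer questions about our deployment and on-call runbooks.
If a question falls outside that scope, politely decline to answer.
"""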


3. The Lifecycle of a Custom Model

The workflow follows three steps (a sample terminal session follows the list):

  1. Write: Create a file named Modelfile (no extension) with your instructions.
  2. Create: Run ollama create [new-name] -f Modelfile. This "compiles" your instructions into a new model entry.
  3. Run: Use ollama run [new-name] to start chatting with your creation.
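
Put together in the terminal, and using support-bot as a made-up model name, the whole lifecycle looks roughly like this:

# Step 1 happens in your editor: save the instructions as a file named Modelfile
# Step 2: compile that file into a new model entry
ollama create support-bot -f Modelfile
# Optional check: the new entry should now show up with your other local models
ollama list
# Step 3: start chatting
ollama run support-bot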

4. A Simple Example Preview

Here is what a basic Modelfile looks like:

FROM llama3
SYSTEM "You are a polite British butler who answers every question with dignity."
PARAMETER temperature 0.2
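
Assuming those three lines are saved as Modelfile in your current directory, you could register and test it with (the name butler is arbitrary):

ollama create butler -f Modelfile
ollama run butler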

In the next lesson, we will break down this syntax line-by-line so you can start building your own.


Key Takeaways

  • A Modelfile is a configuration script for creating custom models.
  • It allows you to bake in system prompts and technical parameters.
  • It enables consistency and reproducibility in AI development.
  • The ollama create command is used to turn a Modelfile into a runnable model.
