Module 10 Lesson 4: Load Balancing with Nginx

Handle the traffic spike. Learn how to use an Nginx container as a 'Reverse Proxy' to distribute incoming requests across multiple app containers.

What happens when a single application container can't handle all your users? You scale up to three containers. But how do users know which one to talk to? You put a Load Balancer in front.

1. The Reverse Proxy Pattern

An Nginx container sits in front of your app.

  1. Users talk to Nginx on port 80.
  2. Nginx looks at its "Upstream" list.
  3. Nginx forwards the request to App 1, App 2, or App 3 (often using "Round Robin" logic).

2. Configuration Example

You would create a config file like this and mount it at /etc/nginx/conf.d/default.conf, which the official nginx image includes inside its http block:

upstream myapp {
    # The pool of app containers; Docker's DNS resolves these service names
    server app1:3000;
    server app2:3000;
    server app3:3000;
}

server {
    listen 80;
    location / {
        # Forward every request to the pool (round robin by default)
        proxy_pass http://myapp;
    }
}
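To try it out, you would mount that file into the official nginx image. The network name below is illustrative, and the app1/app2/app3 containers are assumed to be attached to the same Docker network:

docker network create mynet
docker run -d --name proxy --network mynet -p 80:80 \
    -v "$(pwd)/nginx.conf:/etc/nginx/conf.d/default.conf:ro" \
    nginx:alpine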

3. Scaling with Docker Compose

Docker Compose can spin up the extra copies for you with the --scale flag.

docker-compose up --scale web=3 -d
  • This starts three replicas of the web service.
  • The DNS Trick: Docker's internal DNS will now return multiple IP addresses for the hostname web, and most load balancers (including Nginx) will rotate through those IPs. Note that open-source Nginx resolves the hostname once, when its configuration is loaded, so reload or restart the proxy after scaling. A minimal Compose sketch follows.
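As a rough sketch of how this fits together (the image name my-app is hypothetical, and the mounted nginx.conf is assumed to proxy_pass to http://web:3000 instead of listing app1/app2/app3 individually):

version: "3.8"
services:
  web:
    image: my-app:latest     # hypothetical app image listening on port 3000
    expose:
      - "3000"               # reachable by other containers, not published to the host

  proxy:
    image: nginx:alpine
    ports:
      - "80:80"              # the only port users talk to
    volumes:
      - ./nginx.conf:/etc/nginx/conf.d/default.conf:ro
    depends_on:
      - web

Running docker-compose up --scale web=3 -d then starts three web replicas plus the proxy, and Docker's DNS hands Nginx all three IPs for the hostname web when it loads its configuration.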

4. Why Nginx?

  1. SSL Termination: You can keep your apps simple (plain HTTP) and let Nginx handle the HTTPS certificates (using Certbot/Let's Encrypt).
  2. Health Checks: Nginx can stop sending traffic to a container that stops responding (open-source Nginx does this passively via max_fails and fail_timeout), as sketched below.
  3. Caching: Nginx can store copies of your images/CSS to speed up the site.
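A sketch of what the first two points look like in the config. The certificate paths assume Certbot's default layout and are illustrative; open-source Nginx only does passive health checks, while active checks are part of the commercial edition:

upstream myapp {
    # Passive health check: after 3 failed attempts, skip this server for 30 seconds
    server app1:3000 max_fails=3 fail_timeout=30s;
    server app2:3000 max_fails=3 fail_timeout=30s;
    server app3:3000 max_fails=3 fail_timeout=30s;
}

server {
    listen 443 ssl;
    server_name example.com;

    # SSL termination: HTTPS ends here, the app containers keep speaking plain HTTP
    ssl_certificate     /etc/letsencrypt/live/example.com/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/example.com/privkey.pem;

    location / {
        proxy_pass http://myapp;
    }
}

Caching is its own topic (proxy_cache_path and friends) and is left out here to keep the sketch short.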

Exercise: The Traffic Splitter

  1. Create a docker-compose.yml with two services: app (using hashicorp/http-echo) and proxy (using nginx). A starter sketch follows this list.
  2. Configure app to respond with the text "I am a container" (http-echo takes this via its text setting).
  3. Scale the app to 3 instances: docker-compose up --scale app=3 -d.
  4. Visit the proxy port in your browser.
  5. Check the logs: docker-compose logs -f app.
  6. Refresh your browser 10 times and watch the logs. All three replicas return the same text, but the log prefixes show which replica served each request. Do you see the traffic being distributed across all three?
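If you want a starting point, here is one possible layout (a sketch, not a verified solution; http-echo listens on port 5678 by default, and the -text flag is one way to set the response text):

docker-compose.yml:

version: "3.8"
services:
  app:
    image: hashicorp/http-echo
    command: ["-text=I am a container"]   # http-echo serves this string on port 5678

  proxy:
    image: nginx:alpine
    ports:
      - "8080:80"                         # visit http://localhost:8080
    volumes:
      - ./proxy.conf:/etc/nginx/conf.d/default.conf:ro
    depends_on:
      - app

proxy.conf:

server {
    listen 80;
    location / {
        # Docker's DNS returns one IP per "app" replica; Nginx rotates through them
        proxy_pass http://app:5678;
    }
}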

Summary

Load balancing is the key to High Availability. By placing a professional proxy like Nginx in front of your containers, you can scale horizontally, handle SSL securely, and ensure that your site stays up even if one individual container crashes.

Next Lesson: Locking it down: Securing container communications.
