Why FastAPI is Fast

Deconstructing the engine. Learn about ASGI vs WSGI, the async-first architecture, and the Starlette/Pydantic foundations that make FastAPI high-performance.

Why FastAPI is Fast: Deconstructing the Engine

In the previous lesson, we established that FastAPI is one of the fastest Python frameworks available. But "fast" is a relative term. To build production-grade systems, we need to understand the mechanics of that speed.

FastAPI isn't fast by accident. It is fast by architecture.


1. ASGI vs. WSGI: The Bridge to Async

To understand FastAPI, you must understand the evolution of how Python talks to web servers.

WSGI (Web Server Gateway Interface)

Legacy frameworks like Flask and Django (before 3.0) use WSGI. WSGI is synchronous. It handles one request per thread. If a request takes 5 seconds to query a database, that thread is "blocked"—it can't do anything else.

  • The Problem: High concurrency requires thousands of threads, which consume a large amount of memory.

ASGI (Asynchronous Server Gateway Interface)

FastAPI uses ASGI. ASGI is the spiritual successor to WSGI, designed to handle asynchronous protocols like WebSockets and HTTP/2.

  • The Solution: Instead of blocking a thread, an ASGI server (like Uvicorn) "pauses" the request while waiting for I/O (like a DB query) and immediately picks up another task. A minimal sketch of both interfaces follows the diagram below.

The difference, sketched as a Mermaid diagram:

graph LR
    subgraph "WSGI (Sync)"
    W1[Request 1] --> T1[Thread 1]
    T1 -- Blocked by DB --> T1
    W2[Request 2] --> T2[Thread 2]
    end
    
    subgraph "ASGI (Async)"
    A1[Request 1] --> E[Event Loop]
    A2[Request 2] --> E
    E -- Wait for DB --> A1
    E -- Handle Request --> A2
    end
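
To make the contrast concrete, here is a minimal "hello world" written directly against each interface. This is an illustrative sketch only; in practice FastAPI implements the ASGI side for you, so you never write these callables by hand.

# Minimal WSGI application: a synchronous callable.
# The worker thread is tied up for the entire request.
def wsgi_app(environ, start_response):
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"hello from WSGI"]

# Minimal ASGI application: an async callable.
# Every await hands control back to the event loop, which can
# make progress on other requests in the meantime.
async def asgi_app(scope, receive, send):
    assert scope["type"] == "http"
    await send({
        "type": "http.response.start",
        "status": 200,
        "headers": [(b"content-type", b"text/plain")],
    })
    await send({"type": "http.response.body", "body": b"hello from ASGI"})

You would serve the first with a WSGI server such as Gunicorn and the second with an ASGI server such as Uvicorn.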

2. The Foundation: Starlette

FastAPI is not built from scratch. It is a powerful wrapper around Starlette.

Starlette is a lightweight ASGI framework/toolkit. It handles the "Web" part:

  • Routing
  • WebSockets support
  • Authentication and Permissions
  • Session and Cookie support

By using Starlette, FastAPI inherits years of optimization and a proven, high-concurrency core.
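
To see what that core looks like on its own, here is a minimal Starlette application as a sketch (FastAPI's application class is itself a subclass of Starlette, so every FastAPI app is ultimately one of these underneath):

from starlette.applications import Starlette
from starlette.responses import JSONResponse
from starlette.routing import Route

# A bare Starlette app: routing and responses only, with none of the
# validation or documentation features that FastAPI layers on top.
async def homepage(request):
    return JSONResponse({"hello": "world"})

app = Starlette(routes=[Route("/", homepage)])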


3. The Data Engine: Pydantic

While Starlette handles the networking, Pydantic handles the data.

Most web frameworks spend a significant amount of time parsing JSON strings into Python objects and validating them.

  • Old Way: Manual checks like if "key" in data, hand-written type conversions, and ad-hoc error handling.
  • FastAPI Way: Pydantic performs validation in a compiled core; in Pydantic v2 that core (pydantic-core) is written in Rust.

Pydantic allows FastAPI to:

  1. Validate data types instantly.
  2. Serialize/Deserialize JSON with extreme efficiency.
  3. Generate JSON Schemas for documentation automatically.
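
As a sketch of what that looks like in Pydantic v2 (the Item model and its fields below are made up for illustration):

from pydantic import BaseModel

class Item(BaseModel):
    name: str
    price: float
    in_stock: bool = True

# Parsing and validation run in pydantic-core (compiled Rust in v2).
item = Item.model_validate_json('{"name": "widget", "price": "9.99"}')
print(item.price)  # 9.99, coerced from the JSON string to a float

# The same model also produces a JSON Schema, which FastAPI uses to
# build its interactive OpenAPI documentation automatically.
print(Item.model_json_schema())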

4. Native Async/Await

FastAPI is designed to be "Async-First." This means you can define your endpoints with the async def syntax.

from fastapi import FastAPI

app = FastAPI()

@app.get("/items")
async def read_items():
    # `database` is assumed to be an async client; the await suspends
    # this coroutine during I/O so the event loop can serve other requests.
    data = await database.fetch_all()
    return data

Because FastAPI doesn't need to dedicate a thread to every request, it can hold thousands of concurrent connections on a single machine, whereas a typical thread-per-request Flask deployment is limited by how many worker threads it can afford.


Performance Benchmark (Simplified)

Framework | Requests Per Second (RPS) | Latency
----------|---------------------------|---------
NodeJS    | ~80,000                   | Low
Go        | ~100,000                  | Very Low
FastAPI   | ~75,000                   | Low
Flask     | ~5,000                    | Moderate

Summary

FastAPI's speed comes from three main pillars:

  1. ASGI Architecture: Non-blocking I/O handling via the event loop.
  2. Starlette Core: A high-performance web toolkit.
  3. Pydantic Engine: Extremely fast data validation and serialization.

Understanding these foundations allows you to write code that actually leverages this speed instead of accidentally creating bottlenecks.

In the next lesson, we'll see where this speed is best applied in Real-World Use Cases (Microservices, AI, and SPAs).


Exercise: The Event Loop

If you have a function that calculates a complex mathematical formula that takes 2 seconds of CPU time, should you use async def or def? Hint: Think about whether the "Event Loop" can do anything else while the CPU is busy with math.
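
If you want to check your answer empirically, here is a sketch of two endpoints you could load-test side by side; the helper below is just a hypothetical stand-in for "2 seconds of CPU time":

from fastapi import FastAPI

app = FastAPI()

def crunch() -> int:
    # Pure CPU-bound work with no await points.
    return sum(i * i for i in range(20_000_000))

@app.get("/heavy-async")
async def heavy_async():
    return {"result": crunch()}

@app.get("/heavy-sync")
def heavy_sync():
    return {"result": crunch()}

Hit each endpoint with several concurrent clients and watch whether the rest of the app stays responsive.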
