
Non-Deterministic Behavior
Why your prompt works today but fails tomorrow.
LLMs are probabilistic machines. If you ask the same question twice, you might get two different answers.
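To make that concrete, here is a tiny, self-contained simulation. The `ask` function is a stand-in for a real model call (an assumption, not any particular SDK); it returns one of several plausible completions, the way a sampled model would.

```python
import random
from collections import Counter

def ask(question: str) -> str:
    # Stand-in for a real model call (hypothetical): sampled decoding means
    # the same question can legitimately map to more than one completion.
    completions = [
        "Python was first released in 1991.",
        "Guido van Rossum released Python in 1991.",
        "Python 0.9.0 appeared in February 1991.",
    ]
    return random.choice(completions)

# Ten identical requests, several distinct answers. All of them are
# reasonable, but any code that expects an exact string will break.
counts = Counter(ask("When was Python released?") for _ in range(10))
print(counts)
```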
The Temperature Effect
At `temperature=0`, the model is mostly deterministic, but floating-point rounding differences on GPUs can still cause minor variations.
At `temperature=1`, the model samples from the full output distribution and takes creative liberties.
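The mechanism is just temperature-scaled softmax sampling over the next-token logits. The sketch below uses made-up logits purely for illustration: at `temperature=0` it falls back to greedy decoding, at `temperature=1` it samples.

```python
import math
import random

def sample_token(logits: dict[str, float], temperature: float) -> str:
    """Sample one token from a temperature-scaled softmax over logits."""
    if temperature == 0:
        # Greedy decoding: always pick the highest-logit token.
        return max(logits, key=logits.get)
    scaled = {tok: logit / temperature for tok, logit in logits.items()}
    z = max(scaled.values())  # subtract the max for numerical stability
    weights = {tok: math.exp(s - z) for tok, s in scaled.items()}
    return random.choices(list(weights), weights=list(weights.values()))[0]

# Toy next-token distribution (invented numbers, for illustration only).
logits = {"Paris": 4.0, "Lyon": 2.5, "banana": 0.5}

print([sample_token(logits, 0.0) for _ in range(5)])  # always "Paris"
print([sample_token(logits, 1.0) for _ in range(5)])  # mostly "Paris", sometimes not
```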
```mermaid
graph LR
    Input[Input Prompt] --> LLM{LLM}
    LLM -- "Run 1" --> OutputA[Output A: Correct]
    LLM -- "Run 2" --> OutputB[Output B: Hallucination]
    LLM -- "Run 3" --> OutputC[Output C: Refusal]
    style OutputA fill:#c8e6c9,stroke:#2e7d32
    style OutputB fill:#ffcdd2,stroke:#d32f2f
```
The Chain Reaction
In a linear chain A -> B -> C, a small variation in A becomes a large deviation in B and a complete failure in C (sketched in code after the list below):
- Step 1: "Summarize this." -> (Variation: Skips a key detail).
- Step 2: "Extract entities from summary." -> (Misses the entity entirely).
- Step 3: "Look up entity in DB." -> Crash.
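Here is how that propagation looks in code. The three functions are stand-ins (assumptions, not a real pipeline): step 1 randomly drops the key detail, the way a sampled summary sometimes does, and the failure only surfaces two steps later at the database lookup.

```python
import random

DB = {"ACME Corp": {"ticker": "ACME", "sector": "manufacturing"}}

def summarize(text: str) -> str:
    # Step 1 stand-in: sampled decoding sometimes omits the company name.
    keeps_detail = random.random() > 0.3
    return ("ACME Corp reported record revenue." if keeps_detail
            else "A company reported record revenue.")

def extract_entity(summary: str) -> str | None:
    # Step 2 stand-in: if the summary lost the name, there is nothing to extract.
    return "ACME Corp" if "ACME Corp" in summary else None

def lookup(entity: str | None) -> dict:
    # Step 3: the hard failure. Raises KeyError when entity is None or wrong.
    return DB[entity]

# Roughly 3 runs in 10 crash here, even though no single step "has a bug".
print(lookup(extract_entity(summarize("ACME Corp Q3 earnings report ..."))))
```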
Graph structures help mitigate this by adding validation loops (if C crashes, go back to A), treating non-determinism as a feature to be managed rather than a bug to be feared.
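A minimal sketch of that mitigation, reusing the stand-in functions from the previous example: the final step is validated, and a failure routes back to the start of the chain instead of crashing the run.

```python
def run_with_validation(text: str, max_attempts: int = 3) -> dict:
    """Retry the whole chain when a downstream step fails validation."""
    for _ in range(max_attempts):
        summary = summarize(text)
        entity = extract_entity(summary)
        if entity is None:
            continue          # B produced nothing usable: loop back to A
        try:
            return lookup(entity)
        except KeyError:
            continue          # C crashed: go back to A and re-sample
    raise RuntimeError(f"pipeline failed after {max_attempts} attempts")

print(run_with_validation("ACME Corp Q3 earnings report ..."))
```

With a per-run failure rate of about 30%, three attempts drop the end-to-end failure rate to under 3%; the loop turns an occasional crash into an occasional retry.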