December 21, 2025 · AI Security

Module 10 Lesson 5: Grounding & Hallucinations

When the truth is not enough. Learn how attackers use "Hallucination Anchoring" and "Fact-Fudging" to make AI lie confidently even with perfect data.