
·AI Security
Module 10 Lesson 2: Document Injections
The trojan horse. Learn how attackers embed prompt injection payloads inside legitimate-looking documents to hijack RAG sessions during retrieval.
2 articles

The trojan horse. Learn how attackers embed prompt injection payloads inside legitimate-looking documents to hijack RAG sessions during retrieval.

How AI becomes an XSS vector. Learn how attackers use prompt injection to trick LLM-powered websites into rendering malicious scripts for other users.