
Communication Protocols: The Agentic DSL
Learn how to optimize inter-agent communication. Master the 'Technical Protocol' and 'Shared State' patterns that eliminate conversational overhead.
When people collaborate, they use natural language full of politeness and context. When agents collaborate, politeness is a financial burden.
Example of Waste (Assistant to Specialist):
"Hello Research Agent. I hope you are well. Could you please find the price of gold today and report back to me in a JSON format? Thank you so much!" (45 tokens)
Example of Efficiency (The DSL):
REQ: gold_price | FMT: JSON (7 tokens)
In this lesson, we master the Agentic DSL (Domain Specific Language). We’ll learn how to build a strict, high-density protocol for inter-agent messaging that cuts communication costs by 80%.
1. The "Standard Header" Pattern
Just like HTTP has headers (Content-Type, Auth), your agentic system should have internal headers.
The Protocol:
- T: Task ID
- C: Context Level (1-5)
- I: Input Data
- O: Output Expectation
By sticking to this "Standard Header," you remove the need for the model to "Parse intent" from a sentence. The intent is fixed in the structure.
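A minimal sketch of the "Standard Header" pattern in Python. The field keys (T, C, I, O) come from the protocol above; the `encode_header`/`decode_header` helper names are illustrative, and the decoder assumes field values contain no spaces.

```python
def encode_header(task_id, context_level, input_data, output_expectation):
    """Pack task metadata into a fixed, token-dense header string."""
    return f"T:{task_id} C:{context_level} I:{input_data} O:{output_expectation}"

def decode_header(msg):
    """Split a header string back into a dict.

    Assumes field values contain no spaces (a constraint the
    protocol itself should enforce).
    """
    return dict(pair.split(":", 1) for pair in msg.split(" "))

msg = encode_header("t42", 2, "gold_price", "JSON")
# → "T:t42 C:2 I:gold_price O:JSON"
assert decode_header(msg) == {"T": "t42", "C": "2", "I": "gold_price", "O": "JSON"}
```

Because the structure is fixed, the receiving agent never has to infer intent; it only has to fill in the four slots.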
2. Using "Internal Call" Markers
In a multi-agent graph (LangGraph), you can use a specific symbol (like @) to trigger a specialist.
Agent to Specialist:
@searcher: "Apple Stock 2024"
Specialist to Agent:
RET: 150.22
This "Chatbot Shorthand" is perfectly understood by LLMs if they are instructed as "Technical Operators" in the system prompt.
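The `@` marker can also be handled deterministically on the framework side, before any model is invoked. This is a hypothetical router sketch, not LangGraph's actual API: it scans a message for an `@specialist` marker and dispatches the payload to a registered handler.

```python
import re

# Hypothetical registry: specialist name -> handler function.
HANDLERS = {
    "searcher": lambda query: f"RET: results_for({query})",
}

def route(message):
    """Dispatch '@name: "payload"' messages to the matching specialist.

    Returns the specialist's reply, or None if the message does not
    match the protocol or names an unknown specialist.
    """
    match = re.match(r'@(\w+):\s*"?([^"]*)"?', message)
    if not match:
        return None
    name, payload = match.groups()
    handler = HANDLERS.get(name)
    return handler(payload) if handler else None

print(route('@searcher: "Apple Stock 2024"'))
# → RET: results_for(Apple Stock 2024)
```

Routing on the symbol in code, rather than asking an LLM to decide who should act next, removes an entire model call from the loop.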
3. Implementation: The Protocol Formatter (Python)
Python Code: Minifying Inter-Agent Messages
```python
def format_inter_agent_msg(agent_name, task_body, output_format="text"):
    """
    Wraps a task in a token-dense protocol string.
    """
    protocol = f"[@{agent_name}]"
    protocol += f" B:{task_body}"  # B for Body
    protocol += f" F:{output_format}"  # F for Format
    return protocol

# Result: [@coder] B:fix_indentation_err F:diff
# VS: "Hey coder agent, please fix the indentation in this file and give me a git diff."
```
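On the receiving side, the specialist needs the inverse operation. This is a hypothetical decoder for the `[@name] B:body F:format` layout produced above; it assumes the body and format fields contain no spaces, matching the formatter's single-token style.

```python
import re

def parse_inter_agent_msg(msg):
    """Hypothetical inverse of format_inter_agent_msg.

    Assumes the fixed '[@name] B:body F:format' layout with
    no whitespace inside the B and F fields.
    """
    match = re.match(r"\[@(\w+)\] B:(\S+) F:(\S+)", msg)
    if not match:
        raise ValueError(f"Not a valid protocol string: {msg!r}")
    agent, body, fmt = match.groups()
    return {"agent": agent, "body": body, "format": fmt}

result = parse_inter_agent_msg("[@coder] B:fix_indentation_err F:diff")
# → {"agent": "coder", "body": "fix_indentation_err", "format": "diff"}
```

A strict parser like this also doubles as validation: any agent that drifts back into conversational output fails loudly instead of silently wasting tokens.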
4. The "No-Preamble" Constraint
In your system prompt for every agent in the fleet, you must include a "Protocol Restriction."
The Preamble Constraint:
"INTERNAL COMMUNICATION ONLY. You are talking to another AI. DO NOT use greetings, signatures, or filler words. Output ONLY the protocol string. Failure to be concise costs money."
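One way to enforce this fleet-wide is to prepend the restriction to every agent's system prompt at construction time. The `build_system_prompt` helper below is a sketch, not part of any framework; the constraint text is the one from this lesson.

```python
# The Preamble Constraint from this lesson, stored once and shared
# by every agent in the fleet.
PROTOCOL_RESTRICTION = (
    "INTERNAL COMMUNICATION ONLY. You are talking to another AI. "
    "DO NOT use greetings, signatures, or filler words. "
    "Output ONLY the protocol string. Failure to be concise costs money."
)

def build_system_prompt(role_description):
    """Hypothetical helper: prepend the protocol restriction to a role prompt."""
    return f"{PROTOCOL_RESTRICTION}\n\n{role_description}"

prompt = build_system_prompt("You are the searcher specialist.")
```

Centralizing the constraint in one constant means a single edit tightens (or relaxes) the protocol for the whole swarm.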
5. Token Savings: Scaling the Conversation
In a multi-agent swarm where 5 agents talk to each other to finish one task, "DSL Communication" can save thousands of tokens per user session.
| Protocol | Avg Tokens/Msg | Total (Swarm of 20 Msgs) |
|---|---|---|
| Natural Language | 60 | 1,200 tokens |
| DSL (Shorthand) | 10 | 200 tokens |
| Savings | - | 1,000 tokens |
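The table's arithmetic can be sketched as a back-of-envelope calculation, using the per-message averages above; the numbers are the lesson's illustrative figures, not benchmarks.

```python
def swarm_tokens(avg_tokens_per_msg, num_msgs=20):
    """Total tokens for a swarm exchanging num_msgs messages."""
    return avg_tokens_per_msg * num_msgs

natural = swarm_tokens(60)  # natural language: 1,200 tokens
dsl = swarm_tokens(10)      # DSL shorthand: 200 tokens
print(natural - dsl)        # → 1000 tokens saved per 20-message exchange
```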
6. Summary and Key Takeaways
- Delete Politeness: AI-to-AI talk should be technical, not conversational.
- Define a Header: Use single-letter keys for task metadata (T, B, F).
- Symbols over Words: Use @name to address specialists.
- Enforce at System Level: Make conciseness a primary directive for all agents.
In the next lesson, Shared Context vs. Private Context, we look at how to decide which agent knows what, to prevent "Information Oversaturation."
Exercise: The Protocol Rewrite
- Take a typical "Agent Instruction": "Dear SQL Agent, I need you to select the top 10 users from the database and send them back as a Markdown table."
- Rewrite it in DSL format.
- Validate: Ask an LLM to "Decode" your DSL.
- Does it correctly identify the Role, the Task, and the Format?
- (Usually, Yes).
- Count the savings. (It's usually ~75%).