Module 2 Lesson 4: Tool Invocation Basics

The mechanics of action. How LLMs trigger external functions using structured JSON.

Tool Invocation: Giving the AI "Hands"

To an LLM, "Tools" don't actually exist as software. They exist as Descriptions. An LLM cannot "Click" a button; it can only "Write" a message that says, "I would like to click button X."

The process of taking that message and actually clicking the button is called Tool Invocation.

1. The Interaction Pattern

  1. Prompt: "Search for the weather in Tokyo."
  2. LLM Output: {"tool": "get_weather", "params": {"city": "Tokyo"}} (usually in JSON).
  3. App Logic: Your code sees this JSON, runs the real fetch request to a weather API.
  4. Loop Input: Your code sends the result back to the LLM: "Observation: It is raining in Tokyo."
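
The four steps above can be written out as a hypothetical message transcript. The role names follow the common chat-API convention; the exact wire format varies by provider:

```python
import json

messages = [
    # 1. Prompt
    {"role": "user", "content": "Search for the weather in Tokyo."},
    # 2. LLM output: structured JSON instead of prose
    {"role": "assistant", "content": '{"tool": "get_weather", "params": {"city": "Tokyo"}}'},
    # 3. App logic: your code parses the JSON and calls the real weather API...
    # 4. ...then feeds the result back into the loop as the next message:
    {"role": "user", "content": "Observation: It is raining in Tokyo."},
]

# Your application parses step 2 to decide what to execute:
tool_call = json.loads(messages[1]["content"])
print(tool_call["tool"], tool_call["params"]["city"])  # get_weather Tokyo
```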

2. Function Calling (Native Support)

In the early days of AI, we had to "Beg" models to output JSON. Today, models like GPT-4 and Llama 3 have a native Function Calling mode.

  • You provide the model with a list of "Function Signatures."
  • The model outputs a special "Tool Call" object instead of plain text.
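
As a sketch, a "function signature" in this sense is a name, a natural-language description, and a JSON Schema for the parameters. The nesting below follows the OpenAI-style "tools" format; other providers use slightly different shapes:

```python
# One tool definition the model reads before deciding to call it.
get_weather_tool = {
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the current weather for a given city.",
        "parameters": {
            "type": "object",
            "properties": {
                "city": {
                    "type": "string",
                    "description": "City name, e.g. 'Tokyo'",
                }
            },
            "required": ["city"],
        },
    },
}
```

You pass a list of these alongside the conversation; the model then emits a structured "Tool Call" object that references the name and fills in the parameters.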

3. The Power of Descriptions

The Description of your tool is the most important piece of code you will write.

Bad Description: "search(query)"
LLM failure: It doesn't know what to search for or what format the query should be.

Good Description: "search(query: str) - Searches the internal company wiki for HR and finance policies. Use specific keywords like 'vacation' or 'salary'."
Outcome: The LLM now knows exactly when and how to use this tool.
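
In practice, many agent frameworks build the description the LLM sees straight from the function's docstring, so the docstring becomes the tool's interface. A minimal sketch (the `search` stub body is invented for illustration):

```python
import inspect

def search(query: str) -> str:
    """Searches the internal company wiki for HR and finance policies.
    Use specific keywords like 'vacation' or 'salary'."""
    return f"wiki results for {query!r}"  # stub body for illustration

# The docstring IS the documentation the LLM reads:
tool_spec = {
    "name": search.__name__,
    "description": inspect.getdoc(search),
}
print(tool_spec["description"])
```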


4. Visualizing the Invocation Trace

sequenceDiagram
    participant U as User
    participant A as Agent (Code)
    participant L as LLM (Brain)
    participant T as Weather Tool
    
    U->>A: What's the temp in Paris?
    A->>L: User wants temp in Paris. Tools: [get_weather]
    L-->>A: TOOL_CALL: get_weather("Paris")
    A->>T: Run fetch(Paris)
    T-->>A: Result: "22C"
    A->>L: Observation: "22C"
    L-->>A: Final: "The temperature in Paris is 22C."
    A->>U: The temperature in Paris is 22C.

5. Code Example: A Minimal Tool Handler

import json

# The Tool (The "Hands")
def search_wiki(topic):
    database = {"vacation": "15 days", "maternity": "3 months"}
    return database.get(topic, "Topic not found")

# The Logic (The "Glue")
def process_agent_response(response_json):
    # 1. Parse the JSON
    data = json.loads(response_json)
    
    # 2. Check if the LLM wants a tool
    if "tool" in data:
        tool_name = data["tool"]
        arg = data["params"]["topic"]
        
        # 3. Invoke the actual code
        print(f"--- System: Executing {tool_name} for {arg} ---")
        result = search_wiki(arg)
        
        # 4. Return the observation to the LLM
        return f"Observation: {result}"
    
    # No tool requested: the model answered in plain text
    return response_json
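
To see the handler run end to end without a real API key, here is a self-contained sketch that wires it to a hard-coded stand-in for the LLM. The `fake_llm` function and its canned replies are invented for illustration; a real agent would call a model API here:

```python
import json

def search_wiki(topic):
    database = {"vacation": "15 days", "maternity": "3 months"}
    return database.get(topic, "Topic not found")

def process_agent_response(response_json):
    data = json.loads(response_json)
    if "tool" in data:
        result = search_wiki(data["params"]["topic"])
        return f"Observation: {result}"
    return data.get("answer", "")

def fake_llm(messages):
    # Stand-in for the model: first ask for the tool, then answer.
    if not any("Observation" in m for m in messages):
        return '{"tool": "search_wiki", "params": {"topic": "vacation"}}'
    return '{"answer": "You get 15 days of vacation."}'

messages = ["How much vacation do I get?"]
while True:
    reply = fake_llm(messages)
    handled = process_agent_response(reply)
    if handled.startswith("Observation:"):
        messages.append(handled)   # loop the result back to the "LLM"
    else:
        print(handled)             # final answer for the user
        break
```

The key structural point survives the simplification: the model only ever emits text, and the while loop in your code is what turns that text into executed actions.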

Key Takeaways

  • Tools are triggered by Structured Text (JSON), not magic.
  • Function Calling is a specialized model behavior for outputting tool parameters.
  • Descriptions are the documentation that the LLM reads; they must be clear and logical.
  • The "Tool Execution" happens in Your Code, not inside the LLM brain.
