Hello World Agent: Your First Interactive Chatbot

Cross the starting line of agent development. Build a simple, interactive CLI agent with a custom persona and learn the mechanics of maintaining conversation state with the Gemini Python SDK.

In the previous lesson, we connected our "Nervous System" to the Gemini brain. Now, we are going to give that brain a Mind. A simple "Ping" (where you send text and get text back) is a transaction. An Agent is an interaction.

In this lesson, we will build a "Hello World" agent using the start_chat interface of the Gemini Python SDK. We will implement a custom persona, build a continuous input/output loop in the terminal, and explore how the SDK manages conversation history automatically.


1. The Architecture of an Interactive Agent

To move from a one-off request to an interactive session, our code needs to handle three things:

  1. Identity Initialization: Giving the agent its system instructions.
  2. Stateful Session Management: Keeping track of previous turns.
  3. The User REPL: A "Read-Eval-Print Loop" that allows you to keep talking to the agent in your terminal.

The interaction flow, sketched as a Mermaid diagram:

graph TD
    A[Start Script] --> B[Initialize Model w/ Persona]
    B --> C[Start Chat Session]
    C --> D[Wait for User Input]
    D --> E[Send to Gemini]
    E --> F[Display Response]
    F --> D
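
Before wiring in the real model, here is a minimal sketch of those three pieces with a stand-in reply function (the names here are illustrative, not part of any SDK); the full implementation follows in Section 4.

# A minimal, SDK-agnostic sketch: identity, state, and the REPL.
PERSONA = "You are Orbit, a space exploration guide."  # 1. Identity

history = []  # 2. State: (role, text) pairs from previous turns

def placeholder_reply(user_text):
    # Stand-in for the real model call, so the loop runs without an API key.
    return f"(pretend reply to: {user_text})"

# 3. The REPL
while True:
    user_input = input("You: ")
    if user_input.lower() in ("quit", "exit"):
        break
    reply = placeholder_reply(user_input)
    history.append(("user", user_input))
    history.append(("model", reply))
    print(f"Agent: {reply}")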

2. Defining the Persona

For our first agent, we'll create "Orbit"—a helpful space exploration guide. Orbit shouldn't just answer questions; Orbit should speak with a specific tone and follow specific rules.

System Instructions for Orbit:

"You are Orbit, an AI guide specializing in human spaceflight. You are enthusiastic, use space-related analogies, and always provide a 'Fun Fact' at the end of every message. You keep your answers concise and professional."


3. The start_chat Method

The easiest way to build an interactive agent with the Gemini Python SDK is the start_chat() method.

Why use start_chat instead of generate_content?

  • Automatic History: When you use chat.send_message(), the SDK adds your message AND the model's response to an internal history list automatically.
  • Contextual Continuity: You don't have to manually pass all previous turns back to the model; the chat object re-sends the accumulated history for you on every call.
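
To see what start_chat saves you, here is a rough sketch contrasting manual history management with generate_content against the chat-based approach. It assumes the same genai setup and API key as the script in Section 4.

import google.generativeai as genai

model = genai.GenerativeModel('gemini-1.5-flash')

# Manual approach: you rebuild and resend the full history on every turn.
history = [{"role": "user", "parts": ["Hi, I'm Ada."]}]
reply = model.generate_content(history)
history.append({"role": "model", "parts": [reply.text]})
history.append({"role": "user", "parts": ["What's my name?"]})
reply = model.generate_content(history)  # the whole list goes back each time

# Chat approach: the same bookkeeping happens inside the chat object.
chat = model.start_chat()
chat.send_message("Hi, I'm Ada.")
reply = chat.send_message("What's my name?")  # history is resent for you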

4. Implementation: The hello_world_agent.py

Create a new file named hello_world_agent.py. This script will run until you type "quit" or "exit."

import os
import google.generativeai as genai
from dotenv import load_dotenv

# 1. Setup
load_dotenv()
genai.configure(api_key=os.getenv("GEMINI_API_KEY"))

# 2. Define the Persona
SYSTEM_INSTRUCTION = (
    "You are Orbit, an AI guide specializing in human spaceflight. "
    "Be enthusiastic, use space-related analogies, keep answers concise, "
    "and always end every message with a 'Fun Fact'."
)

# 3. Initialize the Agent
model = genai.GenerativeModel(
    model_name='gemini-1.5-flash',
    system_instruction=SYSTEM_INSTRUCTION
)

# 4. Start the Stateful Session
# The 'history' starts empty.
chat = model.start_chat(history=[])

print("--- ORBIT IS ONLINE (Type 'quit' to exit) ---")

# 5. The REPL (Read-Eval-Print Loop)
while True:
    user_input = input("You: ").strip()

    # Ignore empty input so we don't send a blank message to the API.
    if not user_input:
        continue

    if user_input.lower() in ["quit", "exit"]:
        print("Orbit: Safe travels among the stars! Over and out.")
        break
        
    try:
        # Send message to model (History is handled for us!)
        response = chat.send_message(user_input, stream=True)
        
        print("Orbit: ", end="")
        for chunk in response:
            print(chunk.text, end="", flush=True)
        print("\n") # New line for next input
        
    except Exception as e:
        print(f"\nOrbit: Mission Control, we have a problem: {e}")

5. Key Concept: Streaming Responses

In the code above, we used stream=True and a for chunk in response loop.

Why Stream?

  1. Perceived Performance: Users dislike staring at a blank prompt while a full block of text is generated. Streaming starts displaying words as soon as the first tokens come back, long before the complete response is ready.
  2. User Experience: It feels more "alive" and conversation-like.
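
As a quick sketch (reusing the chat object from the script above), here is the blocking call next to the streamed one:

# Blocking: nothing prints until the entire reply has been generated.
response = chat.send_message("Describe the ISS in three sentences.")
print(response.text)

# Streamed: chunks print as they arrive, so the first words appear almost immediately.
response = chat.send_message("Describe the ISS in three sentences.", stream=True)
for chunk in response:
    print(chunk.text, end="", flush=True)
print()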

6. Tuning Parameters: Temperature and Max Output Tokens

Even in a simple agent, you can control the "Creativity."

  • Temperature: If you find Orbit is too repetitive, increase the temperature to 0.8 or 1.0.
  • Max Tokens: If you want Orbit to be very brief (saving you money on tokens), set max_output_tokens to 150.

# Modified Initialization
config = genai.types.GenerationConfig(
    temperature=0.9,
    max_output_tokens=300
)

model = genai.GenerativeModel(
    model_name='gemini-1.5-flash',
    system_instruction=SYSTEM_INSTRUCTION,
    generation_config=config
)
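
You can also override these settings for a single message instead of the whole model. A small sketch, assuming the chat session from Section 4:

# Per-message override: this turn uses a low temperature and a tight token cap
# without changing the model's defaults for later turns.
precise_config = genai.types.GenerationConfig(temperature=0.1, max_output_tokens=150)

response = chat.send_message(
    "List the planets in order from the Sun.",
    generation_config=precise_config,
)
print(response.text)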

7. Inspecting the Hidden State (The History)

At any point in your script, you can peek into the agent's memory by printing chat.history.

# Insert this after a few turns to see the internal logs
for message in chat.history:
    role = message.role
    text = message.parts[0].text
    print(f"[{role.upper()}]: {text[:50]}...")

This history is exactly what gets sent back to Gemini on every turn. If the history gets too long, Gemini might "forget" earlier turns—a problem we will solve in the next module with more advanced state management.
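
If you want to watch that growth in numbers, you can count the tokens in the accumulated history before the next turn. A quick sketch using the model and chat objects from the script above:

# Measure how many tokens the accumulated history will consume on the next turn.
token_count = model.count_tokens(chat.history)
print(f"History size: {token_count.total_tokens} tokens")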


8. Summary and Exercises

You have just built your first stateful AI agent.

  • System Instructions provide the personality and identity.
  • start_chat manages the continuous memory of the session.
  • The REPL creates an interactive bridge between the human and the AI.
  • Streaming makes the interaction feel fluid and responsive.

Exercises

  1. Persona Modification: Change the System Instruction to transform Orbit into "Rust", a grumpy mining robot from an asteroid belt. How does the interaction change when you ask the same questions?
  2. Memory Test: Tell the agent your name in Turn 1. Ask a question about space in Turn 2. Then, in Turn 3, ask: "Wait, do you remember my name?" Note if the agent successfully recalls your name.
  3. Parameter Tuning: Set the temperature to 0.1 and ask the same question three times. Then set it to 1.5 and ask again. Describe the difference in "Predictability."

In the next lesson, we will upgrade Orbit by giving it Tools—allowing it to actually look up real-time space data or perform calculations.
