Five progressively more capable examples. Each builds on the previous.

1. Model Call

Call an LLM directly:
from definable.model.openai import OpenAIChat
from definable.model.message import Message

model = OpenAIChat(id="gpt-4o")
response = model.invoke(
    messages=[Message(role="user", content="What is the capital of France?")],
    assistant_message=Message(role="assistant", content=""),
)
print(response.content)
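Under the hood, chat models consume a list of role-tagged messages. A minimal sketch of that shape using plain dicts (the OpenAI-style wire format, not Definable's `Message` class — `to_wire_format` is a hypothetical helper for illustration):

```python
# Conceptual sketch: the role/content message list most chat APIs expect.
# Illustrative only -- this does not call any provider.
def to_wire_format(messages):
    """Convert (role, content) pairs into OpenAI-style message dicts."""
    return [{"role": role, "content": content} for role, content in messages]

wire = to_wire_format([
    ("system", "You are a helpful assistant."),
    ("user", "What is the capital of France?"),
])
print(wire)
# [{'role': 'system', 'content': 'You are a helpful assistant.'},
#  {'role': 'user', 'content': 'What is the capital of France?'}]
```

Typed classes like `Message` exist to validate this structure before it is serialized and sent to the provider.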

2. Agent with Tools

Add tools so the agent can take actions:
from definable.agent import Agent
from definable.tool.decorator import tool

@tool
def get_weather(city: str) -> str:
    """Get the current weather for a city."""
    return f"The weather in {city} is sunny, 72°F."

agent = Agent(
    model="gpt-4o",
    tools=[get_weather],
    instructions="You are a helpful weather assistant.",
)

output = agent.run("What's the weather in San Francisco?")
print(output.content)
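A tool decorator's core job is to turn a function's signature and docstring into a schema the model can read when deciding which tool to call. A conceptual sketch of that idea using only the standard library (this is not Definable's actual `@tool` implementation):

```python
import inspect

def tool_schema(fn):
    """Sketch: derive a tool schema from a function's signature and docstring.
    Roughly what a @tool decorator does behind the scenes."""
    sig = inspect.signature(fn)
    return {
        "name": fn.__name__,
        "description": (fn.__doc__ or "").strip(),
        # Map each parameter name to its annotated type's name.
        "parameters": {
            name: param.annotation.__name__
            for name, param in sig.parameters.items()
        },
    }

def get_weather(city: str) -> str:
    """Get the current weather for a city."""
    return f"The weather in {city} is sunny, 72°F."

schema = tool_schema(get_weather)
print(schema)
# {'name': 'get_weather',
#  'description': 'Get the current weather for a city.',
#  'parameters': {'city': 'str'}}
```

The agent sends this schema to the model; when the model responds with a tool call, the agent invokes the matching function and feeds the result back into the conversation.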

3. Agent with Knowledge (RAG)

Ground the agent in your documents:
from definable.agent import Agent
from definable.knowledge import Knowledge
from definable.vectordb import InMemoryVectorDB

knowledge = Knowledge(vector_db=InMemoryVectorDB())
knowledge.add("Definable supports 10 LLM providers including OpenAI and Anthropic.")
knowledge.add("Agents can use tools, knowledge, memory, and middleware.")

agent = Agent(
    model="gpt-4o",
    knowledge=knowledge,
    instructions="Answer questions using the provided knowledge.",
)

output = agent.run("What LLM providers does Definable support?")
print(output.content)
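Conceptually, a vector database embeds each document, embeds the query, and returns the documents whose embeddings are most similar. A toy sketch of that retrieval loop — `ToyVectorDB` and the bag-of-words `embed` are stand-ins for illustration, not `InMemoryVectorDB` or a real embedding model:

```python
import math
from collections import Counter

def embed(text):
    """Toy bag-of-words 'embedding' -- a real system uses a model instead."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse word-count vectors."""
    dot = sum(a[k] * b.get(k, 0) for k in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

class ToyVectorDB:
    def __init__(self):
        self.docs = []

    def add(self, text):
        self.docs.append((embed(text), text))

    def search(self, query, k=1):
        q = embed(query)
        ranked = sorted(self.docs, key=lambda d: cosine(q, d[0]), reverse=True)
        return [text for _, text in ranked[:k]]

db = ToyVectorDB()
db.add("Definable supports 10 LLM providers including OpenAI and Anthropic.")
db.add("Agents can use tools, knowledge, memory, and middleware.")
print(db.search("Which providers are supported?"))
```

In a RAG agent, the top-ranked chunks are injected into the prompt so the model answers from your documents rather than from memory.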

4. Agent with Memory

Give the agent persistent memory across conversations:
from definable.agent import Agent
from definable.memory import Memory, SQLiteStore

agent = Agent(
    model="gpt-4o",
    instructions="You are a helpful assistant with memory.",
    memory=Memory(store=SQLiteStore("./memory.db")),
)

output = agent.run("My name is Alice and I work at Acme.", user_id="alice")
output = agent.run("Where do I work?", user_id="alice")
print(output.content)  # "You work at Acme."
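The idea behind a memory store is simple: persist facts keyed by `user_id`, then recall them on later runs. A conceptual sketch backed by the standard library's `sqlite3` — `ToyMemoryStore` is an illustration of the pattern, not Definable's `SQLiteStore`:

```python
import sqlite3

class ToyMemoryStore:
    """Sketch of a per-user memory store persisted in SQLite."""

    def __init__(self, path=":memory:"):
        self.conn = sqlite3.connect(path)
        self.conn.execute(
            "CREATE TABLE IF NOT EXISTS memories (user_id TEXT, fact TEXT)"
        )

    def remember(self, user_id, fact):
        self.conn.execute("INSERT INTO memories VALUES (?, ?)", (user_id, fact))
        self.conn.commit()

    def recall(self, user_id):
        rows = self.conn.execute(
            "SELECT fact FROM memories WHERE user_id = ?", (user_id,)
        )
        return [fact for (fact,) in rows]

store = ToyMemoryStore()                      # ":memory:" -> no file on disk
store.remember("alice", "Works at Acme.")
print(store.recall("alice"))                  # ['Works at Acme.']
print(store.recall("bob"))                    # []
```

A memory-enabled agent does the same thing with an extra step: it asks the model to extract durable facts from the conversation before writing them to the store.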

5. Streaming

Stream tokens as they are generated:
from definable.agent import Agent

agent = Agent(model="gpt-4o", instructions="You are a helpful assistant.")

for event in agent.run_stream("Tell me a story about AI."):
    if event.event == "RunContent" and event.content:
        print(event.content, end="", flush=True)
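A streaming run is just a generator of typed events: lifecycle markers plus content events carrying token chunks. A self-contained sketch of that consumption pattern — `fake_stream` and its event names stand in for `agent.run_stream()`, which yields real events in the same spirit:

```python
from dataclasses import dataclass

@dataclass
class Event:
    event: str
    content: str = ""

def fake_stream():
    """Stand-in for agent.run_stream(): yields events as tokens arrive."""
    yield Event("RunStarted")
    for token in ["Once ", "upon ", "a ", "time..."]:
        yield Event("RunContent", token)
    yield Event("RunCompleted")

# Keep only content events and join their chunks into the full response.
text = "".join(e.content for e in fake_stream() if e.event == "RunContent")
print(text)  # Once upon a time...
```

Filtering on the event type is what lets the same stream carry tool calls, errors, and completion signals alongside the text itself.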

What’s Next

Agents

The core execution loop with tools, middleware, and tracing.

Teams

Coordinate multiple agents to collaborate or divide work.

Workflows

Orchestrate agents through structured step pipelines.

Models

10 providers with streaming, structured output, and vision.

Tools

Custom tools with hooks, caching, and dependency injection.

Knowledge

Full RAG pipeline with readers, chunkers, and vector databases.