Install

pip install definable

Set Your API Key

export OPENAI_API_KEY="sk-..."
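If you would rather configure the key from Python than from the shell, the standard library's os module can set it for the current process (a minimal sketch; this assumes downstream clients read OPENAI_API_KEY from the environment, which is the usual convention for OpenAI-compatible SDKs):

import os

# Set the key for this process only; child processes inherit it.
os.environ["OPENAI_API_KEY"] = "sk-..."

Prefer the shell export (or a secrets manager) in production so keys never appear in source code.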

Your First Model Call

Call an LLM in a few lines of code:

from definable.models import OpenAIChat

model = OpenAIChat(id="gpt-4o")
response = model.invoke(messages=[{"role": "user", "content": "Hello!"}])
print(response.content)

Your First Agent

Create an agent with tools that can take actions:

from definable.models import OpenAIChat
from definable.tools import tool
from definable.agents import Agent

@tool
def get_weather(city: str) -> str:
    """Get the current weather for a city."""
    return f"The weather in {city} is sunny, 72°F."

agent = Agent(
    model=OpenAIChat(id="gpt-4o"),
    tools=[get_weather],
    instructions="You are a helpful weather assistant.",
)

output = agent.run("What's the weather in San Francisco?")
print(output.content)
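Under the hood, an agent like this runs a simple loop: send the conversation to the model, execute any tool call it requests, append the result, and repeat until the model replies in plain text. Here is a stdlib-only sketch of that loop with a stubbed model (illustrative only, not definable's internals):

def get_weather(city: str) -> str:
    return f"The weather in {city} is sunny, 72°F."

TOOLS = {"get_weather": get_weather}

def stub_model(messages):
    # Stub: request a tool call first, then answer from the tool result.
    if not any(m["role"] == "tool" for m in messages):
        return {"tool": "get_weather", "args": {"city": "San Francisco"}}
    return {"content": messages[-1]["content"]}

def run_agent(prompt: str) -> str:
    messages = [{"role": "user", "content": prompt}]
    while True:
        reply = stub_model(messages)
        if "tool" in reply:
            # Execute the requested tool and feed the result back in.
            result = TOOLS[reply["tool"]](**reply["args"])
            messages.append({"role": "tool", "content": result})
        else:
            return reply["content"]

print(run_agent("What's the weather in San Francisco?"))

The real agent does the same thing, except the model decides which tool to call from the docstrings and type hints on your @tool functions.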

Your First RAG Query

Build a knowledge base and search it:

from definable.knowledge import Knowledge, InMemoryVectorDB, OpenAIEmbedder

knowledge = Knowledge(
    vector_db=InMemoryVectorDB(),
    embedder=OpenAIEmbedder(),
)

# Add documents
knowledge.add("Definable is a Python framework for building AI agents.")
knowledge.add("It supports OpenAI, DeepSeek, Moonshot, and xAI models.")

# Search
results = knowledge.search("What models does Definable support?")
for doc in results:
    print(doc.content)
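Conceptually, the vector DB stores an embedding for each document, embeds the query the same way, and returns documents ranked by cosine similarity. A stdlib-only sketch with toy 3-dimensional "embeddings" (real embedders return vectors with hundreds or thousands of dimensions):

import math

def cosine(a, b):
    # Cosine similarity: dot product normalized by vector lengths.
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm

docs = {
    "Definable is a Python framework for building AI agents.": [0.9, 0.1, 0.2],
    "It supports OpenAI, DeepSeek, Moonshot, and xAI models.": [0.2, 0.9, 0.3],
}

query_vec = [0.25, 0.85, 0.3]  # toy embedding of the query
best = max(docs, key=lambda d: cosine(docs[d], query_vec))
print(best)

Swapping InMemoryVectorDB for a persistent backend changes where vectors live, not this ranking logic.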

Your First Memory-Enabled Agent

Give an agent persistent memory that works across conversations:

from definable.agents import Agent
from definable.memory import CognitiveMemory, SQLiteMemoryStore
from definable.models import OpenAIChat

memory = CognitiveMemory(store=SQLiteMemoryStore(db_path="./memory.db"))

agent = Agent(
    model=OpenAIChat(id="gpt-4o"),
    instructions="You are a helpful assistant with persistent memory.",
    memory=memory,
)

# The agent remembers this across conversations
output = agent.run("My name is Alice and I work at Acme Corp.", user_id="alice")

# Later — the agent recalls the information
output = agent.run("Where do I work?", user_id="alice")
print(output.content)  # "You work at Acme Corp."
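The persistence layer is conceptually simple: facts keyed by user ID are written to SQLite and read back on later runs. A stdlib-only sketch of that idea (not definable's actual schema):

import sqlite3

conn = sqlite3.connect(":memory:")  # pass a file path for real persistence
conn.execute("CREATE TABLE IF NOT EXISTS memories (user_id TEXT, fact TEXT)")

def remember(user_id: str, fact: str) -> None:
    conn.execute("INSERT INTO memories VALUES (?, ?)", (user_id, fact))

def recall(user_id: str) -> list[str]:
    rows = conn.execute("SELECT fact FROM memories WHERE user_id = ?", (user_id,))
    return [fact for (fact,) in rows]

remember("alice", "Works at Acme Corp.")
print(recall("alice"))  # ['Works at Acme Corp.']

A cognitive memory adds extraction on top of this: the model distills durable facts from each conversation before they are stored, so recall returns facts rather than raw transcripts.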

Stream Responses

Get tokens as they are generated:

from definable.models import OpenAIChat

model = OpenAIChat(id="gpt-4o")

for chunk in model.invoke_stream(messages=[{"role": "user", "content": "Tell me a story."}]):
    if chunk.content:
        print(chunk.content, end="", flush=True)
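Streaming works because the client exposes the response as an iterator of chunks, each carrying a (possibly empty) content delta, which is why the loop above checks chunk.content before printing. A plain-Python sketch of that consumption pattern with simulated chunks:

def stream_chunks():
    # Simulated deltas; a real stream yields chunk objects whose
    # .content may be None (e.g. for role or finish-reason chunks).
    yield from ["Once ", "upon ", "a ", "time.", None]

story = ""
for chunk in stream_chunks():
    if chunk:  # skip empty/None deltas
        story += chunk
        print(chunk, end="", flush=True)
print()

Accumulating the deltas, as story does here, is the standard way to recover the full response once the stream ends.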

What’s Next?