Setup

export XAI_API_KEY="xai-..."
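The model reads the key from this environment variable at construction time. A small pre-flight check can catch a missing key early; the helper name below is mine, not part of the library:

```python
import os

def xai_key_configured() -> bool:
    """Return True when XAI_API_KEY is set and non-empty."""
    return bool(os.environ.get("XAI_API_KEY"))
```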

Basic Usage

from definable.model import xAI
from definable.model.message import Message

model = xAI(id="grok-3")
response = model.invoke(
    messages=[Message(role="user", content="Hello!")],
    assistant_message=Message(role="assistant", content=""),
)
print(response.content)

Parameters

id (str, default: "grok-3")
Model identifier. Common values: grok-3, grok-2.

api_key (str)
xAI API key. Defaults to the XAI_API_KEY environment variable.

base_url (str, default: "https://api.x.ai/v1")
xAI API base URL.

temperature (float)
Sampling temperature.

max_tokens (int)
Maximum output tokens.

search_parameters (dict)
Configuration for Grok's live search capabilities.
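The sampling parameters can be combined in the constructor. A sketch; the values here are illustrative, not recommendations:

```python
from definable.model import xAI

model = xAI(
    id="grok-3",
    temperature=0.7,   # lower values make output more deterministic
    max_tokens=1024,   # cap on generated output tokens
)
```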
Live Search

xAI Grok models can search the web for real-time information:
from definable.model import xAI
from definable.model.message import Message

model = xAI(
    id="grok-3",
    search_parameters={"mode": "auto"},
)

response = model.invoke(
    messages=[Message(role="user", content="What happened in tech news today?")],
    assistant_message=Message(role="assistant", content=""),
)
print(response.content)

# Access citations if available
if response.citations:
    for url in response.citations.urls:
        print(f"  Source: {url.url} - {url.title}")
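Beyond "mode": "auto", the search_parameters dict can carry further options. The fields other than "mode" below are assumptions based on xAI's live-search API and should be verified against the current xAI documentation:

```python
search_parameters = {
    "mode": "on",              # force search; "auto" lets the model decide, "off" disables it
    "return_citations": True,  # assumed flag: include source URLs in the response
    "max_search_results": 5,   # assumed cap on the number of sources consulted
}
```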

Streaming

from definable.model.message import Message

for chunk in model.invoke_stream(
    messages=[Message(role="user", content="Explain the latest AI developments.")],
    assistant_message=Message(role="assistant", content=""),
):
    if chunk.content:
        print(chunk.content, end="", flush=True)
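When you need the full text after streaming (for logging, say), the per-chunk deltas can be accumulated. A self-contained sketch; Chunk here is a stand-in for the objects yielded by invoke_stream, assuming only their .content attribute:

```python
from dataclasses import dataclass
from typing import Iterable, Optional

@dataclass
class Chunk:
    # Stand-in for the chunk objects yielded by invoke_stream.
    content: Optional[str]

def collect_stream(chunks: Iterable[Chunk]) -> str:
    """Accumulate streamed content deltas into the full response text."""
    return "".join(c.content for c in chunks if c.content)
```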

Async Usage

from definable.model.message import Message

# ainvoke is a coroutine; call it from inside an async function or event loop.
response = await model.ainvoke(
    messages=[Message(role="user", content="Hello!")],
    assistant_message=Message(role="assistant", content=""),
)
print(response.content)
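From a plain script (not already inside an event loop), the coroutine is driven with asyncio.run. The coroutine below is a hypothetical stand-in for model.ainvoke, used only to show the entry-point pattern:

```python
import asyncio

async def ask(prompt: str) -> str:
    await asyncio.sleep(0)  # placeholder for the network round-trip
    return f"echo: {prompt}"

print(asyncio.run(ask("Hello!")))
```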

xAI exposes an OpenAI-compatible API with extensions for live search; when search is enabled, citations are returned on the response object.