
Setup

Set your xAI API key as an environment variable:

export XAI_API_KEY="xai-..."

Basic Usage

Instantiate the model and call invoke with a list of chat messages:

from definable.models import xAI

model = xAI(id="grok-beta")
response = model.invoke(messages=[{"role": "user", "content": "Hello!"}])
print(response.content)
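The messages argument follows the OpenAI chat message format, so system prompts and multi-turn conversations are plain lists of role/content dicts (the prompt text here is illustrative):

```python
# OpenAI-style chat messages: a system prompt followed by a user turn.
messages = [
    {"role": "system", "content": "You are a concise assistant."},
    {"role": "user", "content": "Summarize Grok in one sentence."},
]

# Every message carries exactly these two keys.
assert all(set(m) == {"role", "content"} for m in messages)
```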

Parameters

id (str, default: "grok-beta")
Model identifier. Common values: grok-beta, grok-2.

api_key (str)
xAI API key. Defaults to the XAI_API_KEY environment variable.

base_url (str, default: "https://api.x.ai/v1")
xAI API base URL.

temperature (float)
Sampling temperature.

max_tokens (int)
Maximum number of output tokens.

search_parameters (dict)
Configuration for Grok's live search capabilities.
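As a sketch of what a search_parameters dict can hold: only "mode": "auto" appears in the example below; the remaining keys are assumptions based on xAI's live-search API and should be checked against the current xAI documentation.

```python
# Hypothetical live-search configuration. Only "mode" is shown in this
# page's example; the other keys are assumed from xAI's live-search API.
search_parameters = {
    "mode": "auto",            # let the model decide when to search
    "return_citations": True,  # ask for source URLs in the response
    "max_search_results": 5,   # cap the number of sources consulted
}
```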
Live Search

xAI Grok models can search the web for real-time information:
model = xAI(
    id="grok-beta",
    search_parameters={"mode": "auto"},
)

response = model.invoke(
    messages=[{"role": "user", "content": "What happened in tech news today?"}]
)
print(response.content)

# Access citations if available
if response.citations:
    for url in response.citations.urls:
        print(f"  Source: {url.url} - {url.title}")

Streaming

Stream tokens as they are generated with invoke_stream:

for chunk in model.invoke_stream(
    messages=[{"role": "user", "content": "Explain the latest AI developments."}]
):
    if chunk.content:
        print(chunk.content, end="", flush=True)
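If you also need the complete text after streaming, accumulate the chunks as they arrive; sketched below with a stand-in generator in place of model.invoke_stream so the example is self-contained:

```python
# Stand-in for a streamed response: chunks whose .content may be None.
class Chunk:
    def __init__(self, content):
        self.content = content

def fake_stream():  # stand-in for model.invoke_stream(...)
    yield from (Chunk("Hel"), Chunk(None), Chunk("lo!"))

# Collect non-empty chunk contents while iterating.
parts = []
for chunk in fake_stream():
    if chunk.content:
        parts.append(chunk.content)

full_text = "".join(parts)  # "Hello!"
```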

Async Usage

response = await model.ainvoke(
    messages=[{"role": "user", "content": "Hello!"}]
)
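ainvoke is a coroutine, so it must run inside an event loop; at the top level of a script, wrap it with asyncio.run. A stand-in coroutine replaces model.ainvoke here so the sketch is self-contained:

```python
import asyncio

async def ainvoke_stub(messages):  # stand-in for model.ainvoke(...)
    await asyncio.sleep(0)         # simulate network I/O
    return {"content": "Hello!"}

async def main():
    response = await ainvoke_stub(
        messages=[{"role": "user", "content": "Hello!"}]
    )
    return response["content"]

result = asyncio.run(main())  # "Hello!"
```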

xAI uses an OpenAI-compatible API with extensions for live search. When search is enabled, citations are returned in the response.