Setup

export DEEPSEEK_API_KEY="sk-..."

Basic Usage

from definable.models import DeepSeekChat

model = DeepSeekChat(id="deepseek-chat")
response = model.invoke(messages=[{"role": "user", "content": "Hello!"}])
print(response.content)

Parameters

id (str, default: "deepseek-chat")
    Model identifier. Common values: deepseek-chat, deepseek-reasoner.
api_key (str)
    DeepSeek API key. Defaults to the DEEPSEEK_API_KEY environment variable.
base_url (str, default: "https://api.deepseek.com")
    DeepSeek API base URL.
temperature (float)
    Sampling temperature.
max_tokens (int)
    Maximum output tokens.
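DeepSeek exposes an OpenAI-compatible endpoint, so these parameters map onto the chat-completions request body roughly as follows. This is a sketch of the wire format, not definable's internals; build_payload is a hypothetical helper:

```python
def build_payload(messages, id="deepseek-chat", temperature=None, max_tokens=None):
    # Assemble an OpenAI-style chat-completions body; optional sampling
    # parameters are included only when explicitly set.
    payload = {"model": id, "messages": messages}
    if temperature is not None:
        payload["temperature"] = temperature
    if max_tokens is not None:
        payload["max_tokens"] = max_tokens
    return payload

payload = build_payload(
    [{"role": "user", "content": "Hello!"}],
    temperature=0.7,
    max_tokens=256,
)
print(payload["model"])  # deepseek-chat
```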

Reasoning Support

DeepSeek models can produce reasoning content alongside their response:

model = DeepSeekChat(id="deepseek-reasoner")
response = model.invoke(
    messages=[{"role": "user", "content": "Solve: what is 15! / 13!?"}]
)
print(response.reasoning_content)  # Step-by-step reasoning
print(response.content)            # Final answer

Streaming

for chunk in model.invoke_stream(
    messages=[{"role": "user", "content": "Explain quantum computing."}]
):
    if chunk.content:
        print(chunk.content, end="", flush=True)

Async Usage

response = await model.ainvoke(
    messages=[{"role": "user", "content": "Hello!"}]
)
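ainvoke returns a coroutine, so outside an already-async context (such as a notebook) it has to be driven by an event loop. A library-free sketch of the pattern, with dummy_ainvoke standing in for model.ainvoke:

```python
import asyncio

async def dummy_ainvoke(messages):
    # Stand-in for model.ainvoke: simulates an asynchronous API call.
    await asyncio.sleep(0)
    return {"content": f"echo: {messages[-1]['content']}"}

async def main():
    response = await dummy_ainvoke(
        messages=[{"role": "user", "content": "Hello!"}]
    )
    return response["content"]

result = asyncio.run(main())
print(result)  # echo: Hello!
```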
Structured Outputs

DeepSeek does not support native structured outputs (supports_native_structured_outputs = False). Structured output requests are handled via prompt engineering with JSON Schema instructions.
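Because structured output falls back to prompt engineering, the mechanics can be sketched without the library: the JSON Schema is embedded in the prompt as instructions, and the model's reply is parsed as JSON. The helper below is illustrative, not definable's actual implementation:

```python
import json

def build_structured_prompt(schema: dict) -> str:
    # With no native response_format enforcement, the schema is injected
    # into the prompt as plain-text instructions.
    return (
        "Respond ONLY with a JSON object matching this JSON Schema:\n"
        + json.dumps(schema, indent=2)
    )

schema = {
    "type": "object",
    "properties": {"answer": {"type": "string"}},
    "required": ["answer"],
}

system_prompt = build_structured_prompt(schema)

# Parsing side: the raw completion text is expected to be JSON.
raw_reply = '{"answer": "42"}'  # stand-in for response.content
parsed = json.loads(raw_reply)
print(parsed["answer"])  # 42
```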