Setup
Set your DeepSeek API key as an environment variable:
export DEEPSEEK_API_KEY="sk-..."
Basic Usage
from definable.model import DeepSeekChat
from definable.model.message import Message
model = DeepSeekChat(id="deepseek-chat")
response = model.invoke(
messages=[Message(role="user", content="Hello!")],
assistant_message=Message(role="assistant", content=""),
)
print(response.content)
Parameters
id (str, default: "deepseek-chat")
Model identifier. Common values: deepseek-chat, deepseek-reasoner.
api_key (str)
DeepSeek API key. Defaults to the DEEPSEEK_API_KEY environment variable.
base_url (str, default: "https://api.deepseek.com")
DeepSeek API base URL.
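As a sketch of the parameter list above, all three can also be passed explicitly at construction time (keyword names as listed; api_key falls back to the environment variable if omitted):

```python
import os

from definable.model import DeepSeekChat

# Explicit configuration; omitting api_key reads DEEPSEEK_API_KEY instead.
model = DeepSeekChat(
    id="deepseek-reasoner",
    api_key=os.environ["DEEPSEEK_API_KEY"],
    base_url="https://api.deepseek.com",
)
```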
Reasoning Support
DeepSeek models can produce reasoning content alongside their response:
from definable.model import DeepSeekChat
from definable.model.message import Message
model = DeepSeekChat(id="deepseek-reasoner")
response = model.invoke(
messages=[Message(role="user", content="Solve: what is 15! / 13!?")],
assistant_message=Message(role="assistant", content=""),
)
print(response.reasoning_content) # Step-by-step reasoning
print(response.content) # Final answer
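The expected final answer can be checked locally, since the first 13 factors of 15! cancel with 13!, leaving 15 × 14:

```python
import math

# 15! / 13! reduces to 15 * 14 after the shared factors cancel.
print(math.factorial(15) // math.factorial(13))  # 210
```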
Streaming
from definable.model import DeepSeekChat
from definable.model.message import Message

model = DeepSeekChat(id="deepseek-chat")
for chunk in model.invoke_stream(
messages=[Message(role="user", content="Explain quantum computing.")],
assistant_message=Message(role="assistant", content=""),
):
if chunk.content:
print(chunk.content, end="", flush=True)
Async Usage
from definable.model import DeepSeekChat
from definable.model.message import Message

model = DeepSeekChat(id="deepseek-chat")
response = await model.ainvoke(
messages=[Message(role="user", content="Hello!")],
assistant_message=Message(role="assistant", content=""),
)
DeepSeek does not support native structured outputs (supports_native_structured_outputs = False). Structured output requests are handled via prompt engineering with JSON Schema instructions.
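A minimal sketch of what prompt-engineered structured output looks like under the hood (the helper below is hypothetical and not part of definable's API): append a JSON Schema instruction to the prompt, then parse the model's reply as JSON.

```python
import json

def build_structured_prompt(question: str, schema: dict) -> str:
    """Hypothetical helper: wrap a question with JSON Schema instructions."""
    return (
        f"{question}\n\n"
        "Respond with a single JSON object matching this JSON Schema, "
        "with no extra text:\n"
        f"{json.dumps(schema)}"
    )

schema = {
    "type": "object",
    "properties": {
        "city": {"type": "string"},
        "population": {"type": "integer"},
    },
    "required": ["city", "population"],
}
prompt = build_structured_prompt("What is the largest city in France?", schema)

# The reply is then parsed with json.loads; a malformed reply should be
# retried or rejected, since nothing enforces the schema server-side.
reply = '{"city": "Paris", "population": 2102650}'  # illustrative reply shape
parsed = json.loads(reply)
print(parsed["city"])
```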