A MemoryStore is the storage backend for CognitiveMemory. Definable provides eight built-in implementations, from ephemeral in-memory stores for testing to production-grade vector databases.

Available Backends

| Backend | Import | Dependency | Best For |
|---|---|---|---|
| InMemoryStore | definable.memory | None | Testing, ephemeral sessions |
| SQLiteMemoryStore | definable.memory | aiosqlite (core) | Local dev, single-process |
| PostgresMemoryStore | definable.memory | asyncpg | Production, multi-process |
| RedisMemoryStore | definable.memory | redis | High-throughput caching |
| ChromaMemoryStore | definable.memory | chromadb | Vector-native storage |
| QdrantMemoryStore | definable.memory | qdrant-client | Vector-native at scale |
| PineconeMemoryStore | definable.memory | pinecone | Managed vector DB |
| MongoMemoryStore | definable.memory | motor | Document-oriented |

Installation

Core stores (InMemory, SQLite) require no extra dependencies. For others, install the matching extra:
pip install 'definable[postgres-memory]'   # asyncpg
pip install 'definable[redis-memory]'       # redis
pip install 'definable[qdrant-memory]'      # qdrant-client
pip install 'definable[chroma-memory]'      # chromadb
pip install 'definable[mongodb-memory]'     # motor
pip install 'definable[pinecone-memory]'    # pinecone

Backend Examples

SQLiteMemoryStore

The default choice for local development. Uses aiosqlite (included as a core dependency).
from definable.memory import CognitiveMemory, SQLiteMemoryStore

store = SQLiteMemoryStore(db_path="./memory.db")
memory = CognitiveMemory(store=store)
Parameters:
- db_path (str, default "./memory.db"): Path to the SQLite database file. Created automatically if it doesn't exist.

PostgresMemoryStore

Production-ready backend with connection pooling. Requires a running PostgreSQL instance.
from definable.memory import CognitiveMemory
from definable.memory.store.postgres import PostgresMemoryStore

store = PostgresMemoryStore(
  db_url="postgresql://user:pass@localhost:5432/mydb",
  pool_size=5,
  table_prefix="memory_",
)
memory = CognitiveMemory(store=store)
Parameters:
- db_url (str, default ""): PostgreSQL connection URL.
- pool_size (int, default 5): Connection pool size.
- table_prefix (str, default "memory_"): Prefix for database table names.

ChromaMemoryStore

Vector-native storage using ChromaDB. Supports both in-memory and persistent modes.
from definable.memory import CognitiveMemory
from definable.memory.store.chroma import ChromaMemoryStore

store = ChromaMemoryStore(
  persist_directory="./chroma_data",
  collection_prefix="memory_",
)
memory = CognitiveMemory(store=store)
Parameters:
- persist_directory (str, default None): Directory for persistent storage. When None, uses in-memory mode.
- collection_prefix (str, default "memory_"): Prefix for ChromaDB collection names.

QdrantMemoryStore

High-performance vector database for production workloads.
from definable.memory import CognitiveMemory
from definable.memory.store.qdrant import QdrantMemoryStore

store = QdrantMemoryStore(
  url="localhost",
  port=6333,
  api_key=None,
  prefix="memory",
  vector_size=1536,
)
memory = CognitiveMemory(store=store)
Parameters:
- url (str, default "localhost"): Qdrant server URL.
- port (int, default 6333): Qdrant server port.
- api_key (str, default None): API key for Qdrant Cloud.
- prefix (str, default "memory"): Prefix for collection names.
- vector_size (int, default 1536): Dimension of embedding vectors.

PineconeMemoryStore

Managed vector database — no infrastructure to maintain.
from definable.memory import CognitiveMemory
from definable.memory.store.pinecone import PineconeMemoryStore

store = PineconeMemoryStore(
  api_key="your-pinecone-api-key",
  index_name="memory",
  vector_size=1536,
)
memory = CognitiveMemory(store=store)
Parameters:
- api_key (str, default ""): Pinecone API key.
- index_name (str, default "memory"): Name of the Pinecone index.
- environment (str, default None): Pinecone environment (e.g., us-east-1-aws).
- vector_size (int, default 1536): Dimension of embedding vectors.

InMemoryStore

Ephemeral store for testing. Data is lost when the process exits.
from definable.memory import CognitiveMemory, InMemoryStore

store = InMemoryStore()
memory = CognitiveMemory(store=store)

MemoryStore Protocol

All backends implement the MemoryStore protocol. You can create custom stores by implementing these methods:
from definable.memory import MemoryStore, Episode, KnowledgeAtom, Procedure, TopicTransition

class MyStore(MemoryStore):
    # Lifecycle
    async def initialize(self) -> None: ...
    async def close(self) -> None: ...

    # Episodes
    async def store_episode(self, episode: Episode) -> str: ...
    async def get_episodes(self, *, user_id=None, session_id=None, limit=50,
                           min_stage=None, max_stage=None) -> list[Episode]: ...
    async def update_episode(self, episode_id: str, **fields) -> None: ...
    async def get_episodes_for_distillation(self, stage: int, older_than: float) -> list[Episode]: ...

    # Knowledge Atoms
    async def store_atom(self, atom: KnowledgeAtom) -> str: ...
    async def get_atoms(self, *, user_id=None, min_confidence=0.1, limit=50) -> list[KnowledgeAtom]: ...
    async def find_similar_atom(self, subject: str, predicate: str, user_id=None) -> KnowledgeAtom | None: ...
    async def update_atom(self, atom_id: str, **fields) -> None: ...
    async def prune_atoms(self, min_confidence: float) -> int: ...

    # Procedures
    async def store_procedure(self, procedure: Procedure) -> str: ...
    async def get_procedures(self, *, user_id=None, min_confidence=0.3) -> list[Procedure]: ...
    async def find_similar_procedure(self, trigger: str, user_id=None) -> Procedure | None: ...
    async def update_procedure(self, procedure_id: str, **fields) -> None: ...

    # Topic Transitions
    async def store_topic_transition(self, from_topic: str, to_topic: str, user_id=None) -> None: ...
    async def get_topic_transitions(self, from_topic: str, user_id=None, min_count=3) -> list[TopicTransition]: ...

    # Vector Search
    async def search_episodes_by_embedding(self, embedding: list[float], *, user_id=None, top_k=20) -> list[Episode]: ...
    async def search_atoms_by_embedding(self, embedding: list[float], *, user_id=None, top_k=20) -> list[KnowledgeAtom]: ...

    # Deletion
    async def delete_user_data(self, user_id: str) -> None: ...
    async def delete_session_data(self, session_id: str) -> None: ...
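
As an illustration, a custom store can start by backing only the episode methods with plain Python data structures. The sketch below is not part of Definable: ListBackedStore is a hypothetical class, only a handful of protocol methods are shown, and the detail that the store assigns an ID when an episode arrives without one is an assumption, not the documented contract.
import uuid

from definable.memory import Episode, MemoryStore

class ListBackedStore(MemoryStore):
    """Illustrative partial store: episodes only, kept in a Python list."""

    def __init__(self) -> None:
        self._episodes: list[Episode] = []

    async def initialize(self) -> None:
        # Nothing to set up for an in-process list.
        pass

    async def close(self) -> None:
        self._episodes.clear()

    async def store_episode(self, episode: Episode) -> str:
        # Assumption: the store assigns an ID if the episode has none.
        if not episode.id:
            episode.id = str(uuid.uuid4())
        self._episodes.append(episode)
        return episode.id

    async def get_episodes(self, *, user_id=None, session_id=None, limit=50,
                           min_stage=None, max_stage=None) -> list[Episode]:
        # Filter by owner, session, and compression stage; results keep
        # insertion order and are capped at `limit`.
        results = []
        for ep in self._episodes:
            if user_id is not None and ep.user_id != user_id:
                continue
            if session_id is not None and ep.session_id != session_id:
                continue
            if min_stage is not None and ep.compression_stage < min_stage:
                continue
            if max_stage is not None and ep.compression_stage > max_stage:
                continue
            results.append(ep)
        return results[:limit]

    # The remaining protocol methods (atoms, procedures, topic transitions,
    # vector search, deletion) would follow the same pattern.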

Data Types

Episode

A single conversation turn stored in memory.
| Field | Type | Default | Description |
|---|---|---|---|
| id | str | | Unique identifier |
| user_id | str \| None | | Owner user ID |
| session_id | str | | Conversation session |
| role | str | | "user" or "assistant" |
| content | str | | Message text |
| embedding | list[float] \| None | None | Vector embedding |
| topics | list[str] | [] | Extracted topics |
| sentiment | float | 0.0 | Sentiment score (-1.0 to 1.0) |
| token_count | int | 0 | Token count of content |
| compression_stage | int | 0 | 0=raw, 1=summary, 2=facts, 3=atoms |
| created_at | float | 0.0 | Unix timestamp |
| last_accessed_at | float | 0.0 | Last retrieval timestamp |
| access_count | int | 0 | Number of times retrieved |
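
As a rough sketch, constructing a raw (stage 0) episode by hand might look like the following. This assumes Episode accepts these fields as keyword arguments and fills the rest with the defaults above; the values are invented for illustration.
import time

from definable.memory import Episode

# Assumption: keyword construction using the fields in the table above,
# with unspecified fields taking the listed defaults.
episode = Episode(
    id="ep-001",
    user_id="alice",
    session_id="session-42",
    role="user",
    content="I just moved to Berlin and started a new job at Acme Corp.",
    topics=["relocation", "work"],
    created_at=time.time(),
)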

KnowledgeAtom

An extracted fact stored as a subject-predicate-object triple.
| Field | Type | Default | Description |
|---|---|---|---|
| id | str | | Unique identifier |
| user_id | str \| None | | Owner user ID |
| subject | str | | Entity (e.g., "Alice") |
| predicate | str | | Relation (e.g., "works-at") |
| object | str | | Value (e.g., "Acme Corp") |
| content | str | | Full text representation |
| embedding | list[float] \| None | None | Vector embedding |
| confidence | float | 1.0 | Confidence score (0.0–1.0) |
| reinforcement_count | int | 0 | Times this fact was reinforced |
| topics | list[str] | [] | Related topics |
| source_episode_ids | list[str] | [] | Episodes this fact was extracted from |
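
Continuing the episode sketch above, the fact it contains could be captured as a triple. This again assumes keyword construction with the fields in the table; in particular, the attribute shown as object is taken verbatim from the table and may be spelled differently in the actual model.
from definable.memory import KnowledgeAtom

# Assumption: keyword construction; the "object" field name follows the table.
atom = KnowledgeAtom(
    id="atom-001",
    user_id="alice",
    subject="Alice",
    predicate="works-at",
    object="Acme Corp",
    content="Alice works at Acme Corp",
    confidence=0.9,
    topics=["work"],
    source_episode_ids=["ep-001"],
)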

Procedure

A learned behavioral pattern.
| Field | Type | Default | Description |
|---|---|---|---|
| id | str | | Unique identifier |
| user_id | str \| None | | Owner user ID |
| trigger | str | | Condition that activates this procedure |
| action | str | | Action to take |
| confidence | float | 0.5 | Confidence score |
| observation_count | int | 1 | Times observed |
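
A procedure pairs a trigger with an action and gains confidence as the pattern is observed repeatedly. A brief construction sketch, under the same keyword-argument assumption as above, with invented values:
from definable.memory import Procedure

# Assumption: keyword construction using the fields in the table above.
procedure = Procedure(
    id="proc-001",
    user_id="alice",
    trigger="user asks for a recap at the end of a session",
    action="summarize key decisions in three bullet points",
    confidence=0.5,
    observation_count=1,
)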