# Vector Memory Overview
OpenClaw's memory system gives the AI assistant long-term memory that persists beyond a single conversation's context window. It uses a vector database to store and retrieve information with semantic search.
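Under the hood, semantic search works by embedding each memory as a vector and ranking stored entries by their similarity to the query embedding. A minimal sketch of that ranking step (the toy 3-dimensional "embeddings" and function names are illustrative, not OpenClaw's actual internals):

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors, in [-1, 1]."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def search(query_vec, memories, top_k=5):
    """Rank stored (text, vector) pairs by similarity to the query vector."""
    scored = [(cosine_similarity(query_vec, vec), text) for text, vec in memories]
    scored.sort(reverse=True)
    return scored[:top_k]

# Toy vectors standing in for real embedding-model output.
memories = [
    ("founded in 2020", [0.9, 0.1, 0.0]),
    ("head of engineering", [0.1, 0.9, 0.2]),
]
results = search([1.0, 0.0, 0.0], memories, top_k=1)
```

Real embeddings have hundreds or thousands of dimensions, and production stores use approximate nearest-neighbor indexes rather than a linear scan, but the ranking idea is the same.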
## Enable the Memory System

```bash
openclaw memory enable
```
Configure vector storage:

```json
{
  "memory": {
    "enabled": true,
    "provider": "local",
    "embeddingModel": "text-embedding-3-small",
    "embeddingProvider": "openai",
    "maxMemories": 10000,
    "searchResults": 5
  }
}
```
## Manually Add Memories

```bash
# Add text memories
openclaw memory add "The company was founded in 2020, headquartered in Beijing"
openclaw memory add "John is the head of engineering, specializing in Python"

# Import from a file
openclaw memory import knowledge.txt
openclaw memory import faq.md --format markdown

# Bulk import from a directory
openclaw memory import ./docs/ --recursive
```
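Bulk import typically splits each file into smaller chunks before embedding, since embedding models have input limits and small chunks retrieve more precisely. A rough sketch of paragraph-based chunking (the chunk size and packing strategy here are assumptions, not necessarily what `openclaw memory import` does):

```python
def chunk_text(text, max_chars=1000):
    """Split on blank lines, packing paragraphs into chunks of at most max_chars."""
    paragraphs = [p.strip() for p in text.split("\n\n") if p.strip()]
    chunks, current = [], ""
    for para in paragraphs:
        # Start a new chunk if appending this paragraph would overflow.
        if current and len(current) + len(para) + 2 > max_chars:
            chunks.append(current)
            current = para
        else:
            current = f"{current}\n\n{para}" if current else para
    if current:
        chunks.append(current)
    return chunks

doc = "First paragraph.\n\nSecond paragraph.\n\n" + "x" * 990
chunks = chunk_text(doc, max_chars=1000)
```

Each resulting chunk would then be embedded and stored as its own memory, with the source file recorded as metadata.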
## Search Memories

```bash
openclaw memory search "When was the company founded"
```

```text
Search results (top 3):

1. [0.92] The company was founded in 2020, headquartered in Beijing
   Source: manual   Added: 2026-03-15
2. [0.78] The company raised Series A funding in 2021
   Source: docs/history.md   Added: 2026-03-15
3. [0.65] The company currently has 200 employees
   Source: docs/about.md   Added: 2026-03-15
```
The number in brackets is the similarity score, from 0 to 1; higher values indicate a closer semantic match to the query.
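Scores in the 0-1 range are commonly derived from cosine similarity, which natively spans -1 to 1. One common rescaling (an assumption for illustration; the source does not specify OpenClaw's exact formula) is:

```python
def normalized_score(cos_sim):
    """Map cosine similarity from [-1, 1] onto a [0, 1] relevance score."""
    return (cos_sim + 1) / 2

assert normalized_score(1.0) == 1.0   # same direction: perfect match
assert normalized_score(0.0) == 0.5   # orthogonal: unrelated
assert normalized_score(-1.0) == 0.0  # opposite direction: no match
```

In practice, scores from real embedding models rarely approach the extremes; for most models anything above roughly 0.8 indicates a strong match.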
## View Memory Statistics

```bash
openclaw memory stats
```

```text
Memory Statistics:
  Total memories: 1,520
  Storage size: 25 MB
  Embedding model: text-embedding-3-small
  Index status: up-to-date
  Last updated: 10 minutes ago

  By source:
    manual: 50
    docs/: 1,200
    conversations: 270
```
## Manage Memories

```bash
# List recent memories
openclaw memory list --last 20

# Delete a specific memory
openclaw memory delete mem_abc123

# Clear all memories
openclaw memory clear --confirm

# Clear by source
openclaw memory clear --source "docs/"
```
## Auto-Capture

Configure the AI to automatically extract and save important information from conversations:

```json
{
  "memory": {
    "autoCapture": {
      "enabled": true,
      "threshold": 0.8,
      "categories": ["fact", "preference", "instruction"],
      "maxPerSession": 10
    }
  }
}
```
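Conceptually, auto-capture scores each candidate fact extracted from a conversation and keeps only those at or above `threshold`, capped at `maxPerSession`. A simplified sketch of that selection step (the tuple shape and confidence values are illustrative):

```python
def select_memories(candidates, threshold=0.8, max_per_session=10):
    """Keep high-confidence candidates, best first, capped per session.

    candidates: list of (confidence, category, text) tuples.
    """
    kept = [c for c in candidates if c[0] >= threshold]
    kept.sort(reverse=True)          # highest-confidence first
    return kept[:max_per_session]

candidates = [
    (0.95, "preference", "User likes coffee"),
    (0.60, "fact", "Weather was rainy"),       # below threshold, dropped
    (0.85, "instruction", "Reply in English"),
]
saved = select_memories(candidates)
```

Raising `threshold` trades recall for precision: fewer noisy memories get stored, at the cost of occasionally missing a genuinely useful fact.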
## Using in AI Conversations

Enable memory tools for a channel:

```json
{
  "channels": {
    "telegram-main": {
      "tools": ["memory_search", "memory_add"]
    }
  }
}
```
Example conversation:

```text
User: Remember that I like coffee
AI: Got it, I've noted that you like coffee.

[A few days later]

User: What drink would you recommend for me?
AI: Based on my memory, you like coffee, so I'd recommend trying a latte or an americano.
```
## Custom Embedding Models

Use a local embedding model:

```json
{
  "memory": {
    "embeddingProvider": "ollama",
    "embeddingModel": "nomic-embed-text",
    "embeddingBaseUrl": "http://localhost:11434"
  }
}
```
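With the `ollama` provider, each memory is embedded by POSTing text to the local Ollama server. A minimal sketch of that request (endpoint and payload shape follow Ollama's embeddings API as commonly documented; verify against your Ollama version):

```python
import json
import urllib.request

def build_request(text, model="nomic-embed-text", base_url="http://localhost:11434"):
    """Build the HTTP request for Ollama's embeddings endpoint."""
    payload = json.dumps({"model": model, "prompt": text}).encode("utf-8")
    return urllib.request.Request(
        f"{base_url}/api/embeddings",
        data=payload,
        headers={"Content-Type": "application/json"},
    )

def embed(text, **kwargs):
    """POST the request and return the embedding vector (needs a running server)."""
    with urllib.request.urlopen(build_request(text, **kwargs)) as resp:
        return json.load(resp)["embedding"]

req = build_request("The company was founded in 2020")
# embed("The company was founded in 2020")  # requires `ollama serve` locally
```

Note that embeddings from different models are not comparable: switching `embeddingModel` generally requires re-embedding (re-indexing) all stored memories.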
## Backup and Restore

```bash
# Back up memory data
openclaw memory export --output memory-backup.json

# Restore memory data
openclaw memory import --input memory-backup.json
```
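Since the backup is a JSON file, a round trip can be sanity-checked with standard tooling. A sketch of what export and restore amount to (the record schema here is an assumption; inspect your actual `memory-backup.json`):

```python
import json
import os
import tempfile

memories = [
    {"id": "mem_abc123", "text": "The company was founded in 2020", "source": "manual"},
]

# "Export": write records to a JSON backup file.
path = os.path.join(tempfile.mkdtemp(), "memory-backup.json")
with open(path, "w", encoding="utf-8") as f:
    json.dump(memories, f, ensure_ascii=False, indent=2)

# "Restore": read them back and verify nothing was lost.
with open(path, encoding="utf-8") as f:
    restored = json.load(f)
```

A backup of the raw text is usually enough: embeddings can be regenerated on restore, which also makes backups portable across embedding models.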
## Summary
The vector memory system gives OpenClaw's AI assistant long-term memory, allowing it to remember user preferences, company knowledge, and key information from past conversations. By combining auto-capture with manual knowledge import, you can build an AI assistant that gets smarter the more you use it.