Dakera AI

AI Agent Memory Platform

The memory engine for AI agents.
Persistent, searchable, decay-weighted agent memory — built in Rust — as a single self-hosted binary.


dakera.ai  ·  Documentation  ·  Request Early Access →

ذاكرة — Dhākira — Arabic for memory


Why Dakera?

Every AI agent session starts from zero. Thousands of interactions — zero retained knowledge. You're paying to re-teach your agents the same things, every time.

Dakera gives your agents persistent, compounding memory backed by production-grade vector search, hybrid retrieval, knowledge graphs, and built-in ML embeddings — in one self-hosted Rust binary.

Stop managing five services. Deploy one binary.


How It Works

1. Store  →  Your agent writes a memory (content + importance score)
2. Recall →  Query by meaning — hybrid vector + BM25 retrieval returns ranked results
3. Decay  →  Memories auto-decay by access pattern; important ones rise, stale ones fade

Everything runs inside the same process — no sidecars, no embedding APIs, no message queues.
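The decay formula itself isn't published here; as a minimal sketch, assume exponential half-life decay over time since last access (the `half_life_days` parameter and the exact weighting are hypothetical, not Dakera's documented behavior):

```python
from datetime import datetime, timedelta, timezone

def decayed_score(similarity: float, importance: float,
                  last_accessed: datetime,
                  half_life_days: float = 30.0) -> float:
    """Weight a retrieval hit by importance and time since last access."""
    age_days = (datetime.now(timezone.utc) - last_accessed).total_seconds() / 86400
    decay = 0.5 ** (age_days / half_life_days)  # halves every half_life_days
    return similarity * importance * decay

# A memory untouched for one half-life ranks at half the weight of a fresh one
fresh = decayed_score(0.9, 0.8, datetime.now(timezone.utc))
stale = decayed_score(0.9, 0.8, datetime.now(timezone.utc) - timedelta(days=30))
```

Frequently recalled memories keep `age_days` near zero, so they hold their rank, while untouched ones fade — matching the "important ones rise, stale ones fade" behavior above.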


Performance

| Metric | Value |
|---|---|
| LoCoMo recall benchmark | 87.6% |
| p99 query latency | < 10 ms |
| Insert throughput | 27.4M / second |
| Binary size | ~44 MB |
| External runtime dependencies | 0 |

Benchmarked against the full 1,540-question LoCoMo conversational recall suite.


What's Inside

| Instead of running separately | Dakera provides |
|---|---|
| Qdrant / Pinecone / Weaviate | HNSW · IVF · SPFresh vector index |
| Elasticsearch / OpenSearch | BM25 full-text search engine |
| OpenAI / Cohere embeddings | On-device ONNX — zero API calls |
| Redis / Postgres memory layer | Decay-weighted agent memory with sessions |
| Neo4j | Built-in knowledge graph with cross-agent network |
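Hybrid retrieval needs a way to merge the vector and BM25 rankings above into one result list. Dakera's exact fusion method isn't documented here, but reciprocal rank fusion (RRF) is a common choice and gives the flavor; a minimal sketch:

```python
def rrf_fuse(vector_ranked: list[str], bm25_ranked: list[str],
             k: int = 60) -> list[str]:
    """Fuse two ranked lists of document IDs with reciprocal rank fusion."""
    scores: dict[str, float] = {}
    for ranking in (vector_ranked, bm25_ranked):
        for rank, doc_id in enumerate(ranking):
            # Each list contributes 1 / (k + rank); high ranks dominate
            scores[doc_id] = scores.get(doc_id, 0.0) + 1.0 / (k + rank + 1)
    return sorted(scores, key=scores.get, reverse=True)

# "b" places highly in both rankings, so it rises to the top of the fused list
fused = rrf_fuse(["a", "b", "c"], ["b", "c", "a"])
```

`k = 60` is the conventional damping constant from the RRF literature; it keeps a single #1 rank from drowning out consistent mid-list agreement.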

Quick Start

```shell
# Pull and run
docker run -d -p 3300:3300 -e DAKERA_API_KEY=my-key ghcr.io/dakera-ai/dakera:latest

# Verify
curl http://localhost:3300/health
```

Python:

```shell
pip install dakera
```

```python
from dakera import DakeraClient

client = DakeraClient(base_url="http://localhost:3300", api_key="my-key")

# Store a memory
client.memories.store(
    agent_id="my-agent",
    content="User prefers TypeScript over Python",
    importance=0.8,
    tags=["preference"],
)

# Recall relevant memories
memories = client.memories.recall(agent_id="my-agent", query="language preferences")
```

TypeScript:

```shell
npm install dakera
```

```typescript
import { DakeraClient } from 'dakera';

const client = new DakeraClient({ baseUrl: 'http://localhost:3300', apiKey: 'my-key' });

await client.memories.store({
  agentId: 'my-agent',
  content: 'User prefers TypeScript over Python',
  importance: 0.8,
});

const memories = await client.memories.recall({
  agentId: 'my-agent',
  query: 'language preferences',
});
```

MCP — 84 Tools for AI Assistants

Add Dakera to Claude, Cursor, or Windsurf in 30 seconds:

Claude Desktop (`~/Library/Application Support/Claude/claude_desktop_config.json`):

```json
{
  "mcpServers": {
    "dakera": {
      "command": "dakera-mcp",
      "env": { "DAKERA_API_URL": "http://localhost:3300", "DAKERA_API_KEY": "your-key" }
    }
  }
}
```

Claude Code (`.claude/settings.json` in your project):

```json
{
  "mcpServers": {
    "dakera": {
      "command": "dakera-mcp",
      "env": { "DAKERA_API_URL": "http://localhost:3300", "DAKERA_API_KEY": "your-key" }
    }
  }
}
```

84 tools across: Memory CRUD · Vector Operations · Knowledge Graph · Sessions · Namespaces · Decay Engine · AutoPilot · Full-text Index

Full MCP documentation


Open Source Packages

Core SDKs

| Package | Registry | Install |
|---|---|---|
| dakera-py | PyPI | `pip install dakera` |
| dakera-js | npm | `npm install dakera` |
| dakera-rs | crates.io | `cargo add dakera-client` |
| dakera-go | GitHub release | `go get github.com/dakera-ai/dakera-go` |
| dakera-cli | crates.io | `cargo install dakera-cli` |
| dakera-mcp | crates.io | bundled with server |

Framework Integrations

| Package | Registry | Install |
|---|---|---|
| dakera-langchain | PyPI | `pip install langchain-dakera` |
| dakera-llamaindex | PyPI | `pip install llamaindex-dakera` |
| dakera-crewai | PyPI | `pip install crewai-dakera` |
| dakera-autogen | PyPI | `pip install autogen-dakera` |
| dakera-langchain-js | npm | `npm install langchain-dakera` |

All SDKs and integrations are MIT licensed. The core engine and dashboard are proprietary.


Deployment

```shell
# Docker (quickest)
docker run -d -p 3300:3300 -p 3500:3500 \
  -e DAKERA_API_KEY=my-key \
  ghcr.io/dakera-ai/dakera:latest

# Helm
helm install dakera oci://ghcr.io/dakera-ai/dakera-helm/dakera \
  --namespace dakera --create-namespace \
  --set dakera.rootApiKey=my-key \
  --set minio.rootPassword=my-password
```
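The same Docker run can be captured in a Compose file for repeatable local setups. A minimal sketch using only the ports and environment variable shown above (the named volume and its `/data` mount path are assumptions, not documented defaults):

```yaml
services:
  dakera:
    image: ghcr.io/dakera-ai/dakera:latest
    ports:
      - "3300:3300"   # API (health check: curl http://localhost:3300/health)
      - "3500:3500"   # second port from the Docker command above
    environment:
      DAKERA_API_KEY: my-key
    volumes:
      # Hypothetical data path; check the deployment docs for the real one
      - dakera-data:/data

volumes:
  dakera-data:
```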

Full deployment docs


Documentation

dakera.ai/docs — Getting started, MCP setup, SDK references, configuration, deployment, architecture


dakera.ai  ·  Built in Rust 🦀  ·  Request early access →
