
Memori Python SDK

Project description

Memori Labs

Memory from what agents do, not just what they say.

Memori plugs into the software and infrastructure you already use. It is LLM-, datastore-, and framework-agnostic and integrates seamlessly into the architecture you've already designed.

Memori Cloud — Zero config. Get an API key and start building in minutes.






Getting Started

Installation

TypeScript SDK
npm install @memorilabs/memori
Python SDK
pip install memori

Quickstart

Sign up at app.memorilabs.ai, get a Memori API key, and start building. Full docs: memorilabs.ai/docs/memori-cloud/.

Set MEMORI_API_KEY and your LLM API key (e.g. OPENAI_API_KEY), then:

TypeScript SDK
import { OpenAI } from 'openai';
import { Memori } from '@memorilabs/memori';

// Requires MEMORI_API_KEY and OPENAI_API_KEY in your environment
const client = new OpenAI();
const mem = new Memori().llm
  .register(client)
  .attribution('user_123', 'support_agent');

async function main() {
  await client.chat.completions.create({
    model: 'gpt-4o-mini',
    messages: [{ role: 'user', content: 'My favorite color is blue.' }],
  });
  // Conversations are persisted and recalled automatically in the background.

  const response = await client.chat.completions.create({
    model: 'gpt-4o-mini',
    messages: [{ role: 'user', content: "What's my favorite color?" }],
  });
  // Memori recalls that your favorite color is blue.
}

main();
Python SDK
from memori import Memori
from openai import OpenAI

# Requires MEMORI_API_KEY and OPENAI_API_KEY in your environment
client = OpenAI()
mem = Memori().llm.register(client)

mem.attribution(entity_id="user_123", process_id="support_agent")

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "My favorite color is blue."}]
)
# Conversations are persisted and recalled automatically.

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "What's my favorite color?"}]
)
# Memori recalls that your favorite color is blue.

Explore the Memories

Use the Dashboard — Memories, Analytics, Playground, and API Keys.

[!TIP] Want to use your own database? Check out the Memori BYODB docs: https://memorilabs.ai/docs/memori-byodb/.

LoCoMo Benchmark

Memori was evaluated on the LoCoMo benchmark for long-conversation memory and achieved 81.95% overall accuracy while using an average of 1,294 tokens per query. That is just 4.97% of the full-context footprint, showing that structured memory can preserve reasoning quality without forcing large prompts into every request.
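
Those two figures pin down the implied full-context footprint, since 1,294 tokens is stated to be 4.97% of it:

```python
# Sanity check on the benchmark numbers quoted above.
avg_tokens = 1294                 # average tokens per query with Memori
fraction_of_full_context = 0.0497 # Memori's share of the full-context footprint
implied_full_context = avg_tokens / fraction_of_full_context
print(round(implied_full_context))  # prints 26036, i.e. ~26k tokens per query
```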

Compared with other retrieval-based memory systems, Memori outperformed Zep, LangMem, and Mem0 while reducing prompt size by roughly 67% vs. Zep and lowering context cost by more than 20x vs. full-context prompting.

Read the benchmark overview, see the results, or download the paper.

[Figure: Memori's average accuracy along with the standard deviation]

OpenClaw (Persistent Memory for Your Gateway)

By default, OpenClaw agents forget everything between sessions. The Memori plugin fixes that. It captures durable facts and preferences after each conversation, then injects the most relevant context back into future prompts automatically.

No changes to your agent code or prompts are required. The plugin hooks into OpenClaw's lifecycle, so you get structured memory, Intelligent Recall, and Advanced Augmentation with a drop-in plugin.

openclaw plugins install @memorilabs/openclaw-memori
openclaw plugins enable openclaw-memori

openclaw config set plugins.entries.openclaw-memori.config.apiKey "YOUR_MEMORI_API_KEY"
openclaw config set plugins.entries.openclaw-memori.config.entityId "your-app-user-id"

openclaw gateway restart

For setup and configuration, see the OpenClaw Quickstart. For architecture and lifecycle details, see the OpenClaw Overview.

Hermes Agent (Persistent Memory Provider)

Memori also ships as a Hermes Agent memory provider. It captures completed conversations in the background and gives Hermes explicit memori_recall and memori_recall_summary tools for agent-controlled recall.

pip install hermes-memori

hermes config set memory.provider memori
HERMES_HOME="${HERMES_HOME:-$HOME/.hermes}"
mkdir -p "$HERMES_HOME"
echo "MEMORI_API_KEY=YOUR_MEMORI_API_KEY" >> "$HERMES_HOME/.env"
echo "MEMORI_ENTITY_ID=your-app-user-id" >> "$HERMES_HOME/.env"
echo "MEMORI_PROJECT_ID=hermes" >> "$HERMES_HOME/.env"

For setup and configuration, see the Hermes Quickstart. For architecture and lifecycle details, see the Hermes Overview.

MCP (Connect Your Agent in One Command)

Your agent forgets everything between sessions. Memori fixes that. It remembers your stack, your conventions, and how you like things done so you stop repeating yourself.

Works for solo developers and teams. Your agent learns coding patterns, reviewer preferences, and project conventions over time. For teams, that means shared context that new engineers pick up on day one instead of absorbing tribal knowledge over months.

If you use Claude Code, Cursor, Codex, Warp, or Antigravity, you can connect Memori with no SDK integration needed:

claude mcp add --transport http memori https://api.memorilabs.ai/mcp/ \
  --header "X-Memori-API-Key: ${MEMORI_API_KEY}" \
  --header "X-Memori-Entity-Id: your_username" \
  --header "X-Memori-Process-Id: claude-code"

For Cursor, Codex, Warp, and other clients, see the MCP client setup guide.

Attribution

To get the most out of Memori, attribute your LLM interactions to an entity (a person, place, or thing, such as a user) and a process (your agent, LLM interaction, or program).

If you do not provide any attribution, Memori cannot make memories for you.

TypeScript SDK
mem.attribution("12345", "my-ai-bot");
Python SDK
mem.attribution(entity_id="12345", process_id="my-ai-bot")
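
Register once, then attribute per user. A minimal sketch of per-user attribution in a multi-user app; the `attribution_ids` helper and its `user_` prefix are hypothetical conventions for illustration, not part of the SDK:

```python
def attribution_ids(user_id: str, agent_name: str) -> tuple[str, str]:
    """Derive a stable (entity_id, process_id) pair for Memori attribution.

    The "user_" prefix is a hypothetical naming convention, not an SDK requirement.
    """
    return (f"user_{user_id}", agent_name)

def handle_request(user_id: str, prompt: str) -> str:
    # Requires `pip install memori openai` plus MEMORI_API_KEY / OPENAI_API_KEY.
    from memori import Memori
    from openai import OpenAI

    client = OpenAI()
    mem = Memori().llm.register(client)
    entity_id, process_id = attribution_ids(user_id, "support_agent")
    mem.attribution(entity_id=entity_id, process_id=process_id)

    resp = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": prompt}],
    )
    return resp.choices[0].message.content
```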

Session Management

Memori uses sessions to group your LLM interactions together. For example, if you have an agent that executes multiple steps, you likely want those steps recorded in a single session.

By default, Memori manages the session for you, but you can start a new session or override the current one by executing the following:

TypeScript SDK
mem.resetSession();
// or
mem.setSession(sessionId);
Python SDK
mem.new_session()
# or
mem.set_session(session_id)
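
For a multi-step agent, the pattern is to start one session and run every step inside it. A minimal sketch; `run_agent_steps` is a hypothetical helper built only on the `new_session` API shown above:

```python
def run_agent_steps(mem, client, steps: list[str]) -> list[str]:
    """Run each step of an agent run under a single Memori session.

    `mem` is a registered Memori instance and `client` is the LLM client
    it was registered against (see the quickstart).
    """
    mem.new_session()  # one fresh session groups all the steps below
    replies = []
    for step in steps:
        resp = client.chat.completions.create(
            model="gpt-4o-mini",
            messages=[{"role": "user", "content": step}],
        )
        replies.append(resp.choices[0].message.content)
    return replies
```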

Supported LLMs

  • Anthropic
  • Bedrock
  • DeepSeek
  • Gemini
  • Grok (xAI)
  • OpenAI (Chat Completions & Responses API)

(streaming and non-streaming, synchronous and asynchronous)
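
Since asynchronous clients are listed as supported, usage plausibly mirrors the synchronous quickstart. A sketch, assuming `register()` accepts `AsyncOpenAI` clients the same way it accepts sync ones (an assumption; check the docs for your provider):

```python
import asyncio

async def ask(prompt: str) -> str:
    # Assumes MEMORI_API_KEY and OPENAI_API_KEY are set, as in the quickstart.
    from memori import Memori
    from openai import AsyncOpenAI

    client = AsyncOpenAI()
    mem = Memori().llm.register(client)  # assumed to work for async clients too
    mem.attribution(entity_id="user_123", process_id="support_agent")

    resp = await client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": prompt}],
    )
    return resp.choices[0].message.content

# asyncio.run(ask("What's my favorite color?"))
```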

Supported Frameworks

  • Agno
  • LangChain
  • Pydantic AI

Supported Platforms

  • DeepSeek
  • Nebius AI Studio

Examples

For more examples and demos, check out the Memori Cookbook.

Memori Advanced Augmentation

Memories are tracked at several different levels:

  • entity: think person, place, or thing; like a user
  • process: think your agent, LLM interaction or program
  • session: the current interactions between the entity, process and the LLM

Memori's Advanced Augmentation enhances memories at each of these levels with:

  • attributes
  • events
  • facts
  • people
  • preferences
  • relationships
  • rules
  • skills

Memori knows who your user is and what tasks your agent handles, and builds unparalleled context between the two. Augmentation occurs in the background, incurring no added latency.

By default, Memori Advanced Augmentation is available without an account but is rate-limited. When you need increased limits, sign up for Memori Advanced Augmentation or use the Memori CLI:

# The CLI is included when you pip install memori
python -m memori sign-up <email_address>

Memori Advanced Augmentation is always free for developers!

Once you've obtained an API key, set the following environment variable (used by both Python and TypeScript SDKs):

export MEMORI_API_KEY=[api_key]

Managing Your Quota

At any time, you can check your quota using the Memori CLI (works for both SDKs):

python -m memori quota

Or by checking your account at https://app.memorilabs.ai/. If you have reached your IP address quota, sign up and get an API key for increased limits.

If your API key exceeds its quota limits, we will email you to let you know.

Command Line Interface (CLI)

The Memori CLI is the unified tool for managing your account, keys, and quotas across all SDKs. To use it, execute the following from the command line:

# Requires Python installed
python -m memori

This will display a menu of the available options. For more information about what you can do with the Memori CLI, please reference Command Line Interface.

Contributing

We welcome contributions from the community! Please see our Contributing Guidelines for details on:

  • Setting up your development environment
  • Code style and standards
  • Submitting pull requests
  • Reporting issues

Support


License

Apache 2.0 - see LICENSE

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

memori-3.3.3.tar.gz (186.5 kB view details)

Uploaded: Source

Built Distributions

If you're not sure about the file name format, learn more about wheel file names.

memori-3.3.3-cp310-abi3-win_amd64.whl (4.8 MB view details)

Uploaded: CPython 3.10+, Windows x86-64

memori-3.3.3-cp310-abi3-manylinux_2_28_x86_64.whl (8.0 MB view details)

Uploaded: CPython 3.10+, manylinux (glibc 2.28+) x86-64

memori-3.3.3-cp310-abi3-manylinux_2_28_aarch64.whl (8.5 MB view details)

Uploaded: CPython 3.10+, manylinux (glibc 2.28+) ARM64

memori-3.3.3-cp310-abi3-macosx_11_0_x86_64.whl (5.4 MB view details)

Uploaded: CPython 3.10+, macOS 11.0+ x86-64

memori-3.3.3-cp310-abi3-macosx_11_0_arm64.whl (5.2 MB view details)

Uploaded: CPython 3.10+, macOS 11.0+ ARM64

memori-3.3.3-cp310-abi3-android_24_x86_64.whl (8.4 MB view details)

Uploaded: Android API level 24+ x86-64, CPython 3.10+

memori-3.3.3-cp310-abi3-android_24_arm64_v8a.whl (8.8 MB view details)

Uploaded: Android API level 24+ ARM64 v8a, CPython 3.10+

File details

Details for the file memori-3.3.3.tar.gz.

File metadata

  • Download URL: memori-3.3.3.tar.gz
  • Upload date:
  • Size: 186.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for memori-3.3.3.tar.gz
Algorithm Hash digest
SHA256 f9fa07b90581eb0f925a2e1dca5ad21ff8534ed62d0f3e731bb51634014932f0
MD5 f879d0e46f9d94f846300e0c5d2c01e8
BLAKE2b-256 bec568995f180c75b50cf6bf65044a7be88bd0e707fecfad9827b749cea26e3e

See more details on using hashes here.

File details

Details for the file memori-3.3.3-cp310-abi3-win_amd64.whl.

File metadata

  • Download URL: memori-3.3.3-cp310-abi3-win_amd64.whl
  • Upload date:
  • Size: 4.8 MB
  • Tags: CPython 3.10+, Windows x86-64
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for memori-3.3.3-cp310-abi3-win_amd64.whl
Algorithm Hash digest
SHA256 4ae5576c0af77f1a0724752a2d7e68307021a9d7b0b5ac56e813755bda766b62
MD5 297685a56f1848d382cd68344bdc0b34
BLAKE2b-256 0a18a71fd60d52903abfaa5bacdc5d0c09b050ab3f01783c7651283f746a5ddf


File details

Details for the file memori-3.3.3-cp310-abi3-manylinux_2_28_x86_64.whl.

File metadata

File hashes

Hashes for memori-3.3.3-cp310-abi3-manylinux_2_28_x86_64.whl
Algorithm Hash digest
SHA256 f6415477dbe2249e39280c129573ef9d691248ddcea0ccdf82bd4e37e7bfef23
MD5 659c231726fd8d6b59d426d699aa1d4f
BLAKE2b-256 96fa32ee183c4362df6f9519f0621022779841c5af47d0f7ca97ffa1b2917b59


File details

Details for the file memori-3.3.3-cp310-abi3-manylinux_2_28_aarch64.whl.

File metadata

File hashes

Hashes for memori-3.3.3-cp310-abi3-manylinux_2_28_aarch64.whl
Algorithm Hash digest
SHA256 553f858fb21706bf300b676ed4762d170e354633cca744b35afa1506393eaece
MD5 5026b028d6f823a4760920fe86169367
BLAKE2b-256 2528e76ba13f7c23ada59003b549d8adad75ff8d7262c0abcbcae943c2b30923


File details

Details for the file memori-3.3.3-cp310-abi3-macosx_11_0_x86_64.whl.

File metadata

File hashes

Hashes for memori-3.3.3-cp310-abi3-macosx_11_0_x86_64.whl
Algorithm Hash digest
SHA256 7dd9c2e5c1401d5b4c1c40d06bde7016ceb762328810b33d9e838b5107a0f08f
MD5 dcf2ee84a137fedfbdc4fa2a599e2307
BLAKE2b-256 c7692313aab31badd73deeefb3d1a32a4a7aa2df1ce0aadaedb6d374e36ff6da


File details

Details for the file memori-3.3.3-cp310-abi3-macosx_11_0_arm64.whl.

File metadata

File hashes

Hashes for memori-3.3.3-cp310-abi3-macosx_11_0_arm64.whl
Algorithm Hash digest
SHA256 fb7c753705648ffcbe0aa4d6d6e6c4d8474f97a58482a129fb6099cf0cbe5aff
MD5 7922e00df4298f426aa930b5ba8aa843
BLAKE2b-256 d703b6e92b69801042bdcfd9b690c6ae9452590d1e78b9411976bb9167082b4f


File details

Details for the file memori-3.3.3-cp310-abi3-android_24_x86_64.whl.

File metadata

  • Download URL: memori-3.3.3-cp310-abi3-android_24_x86_64.whl
  • Upload date:
  • Size: 8.4 MB
  • Tags: Android API level 24+ x86-64, CPython 3.10+
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for memori-3.3.3-cp310-abi3-android_24_x86_64.whl
Algorithm Hash digest
SHA256 e832cd8633f7f766f3e46a86c4518a80f4873e1175c686ff125d4f237124cd51
MD5 3bfe4f075fbd5f64e0fda33129e0ebdc
BLAKE2b-256 8ccbbe94089d0d019081aab9910c069f3ea6c79e272628215a5c9d8f6a4febc1


File details

Details for the file memori-3.3.3-cp310-abi3-android_24_arm64_v8a.whl.

File metadata

File hashes

Hashes for memori-3.3.3-cp310-abi3-android_24_arm64_v8a.whl
Algorithm Hash digest
SHA256 14deb22803f54014a21ae5e2b7e978be48783e9593c1fb7b8934f2c6c22967c8
MD5 7269792902cab21897f38bd467ed2315
BLAKE2b-256 7d626102abc7da39b555bab4936f9263be57cfb00ac86d7bd3e39cd4729fdfa6

