📢 Announcing our research paper: Mem0 achieves 26% higher accuracy than OpenAI Memory, 91% lower latency, and 90% token savings! Read the paper to learn how we’re revolutionizing AI agent memory.

OpenMemory is a local memory infrastructure powered by Mem0 that lets you carry your memory across any AI app. It provides a unified memory layer that stays with you, enabling agents and assistants to remember what matters across applications.

OpenMemory UI

What is the OpenMemory MCP Server

The OpenMemory MCP Server is a private, local-first memory server that creates a shared, persistent memory layer for your MCP-compatible tools. This runs entirely on your machine, enabling seamless context handoff across tools. Whether you’re switching between development, planning, or debugging environments, your AI assistants can access relevant memory without needing repeated instructions.

The OpenMemory MCP Server ensures all memory stays local, structured, and under your control with no cloud sync or external storage.

How the OpenMemory MCP Server Works

Built around the Model Context Protocol (MCP), the OpenMemory MCP Server exposes a standardized set of memory tools:

  • add_memories: Store new memory objects
  • search_memory: Retrieve relevant memories
  • list_memories: View all stored memories
  • delete_all_memories: Clear memory entirely

Any MCP-compatible tool can connect to the server and use these APIs to persist and access memory.
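To make the tool surface concrete, here is a minimal sketch of the semantics of these four operations using a plain in-memory store. This is an illustration only, not the OpenMemory server itself: the real server persists memories locally and exposes these operations as MCP tools, and its retrieval is semantic search rather than the naive substring match used here.

```python
# Illustrative sketch of the four memory tools' semantics.
# NOT the OpenMemory implementation: the real server stores data
# locally on disk and serves these operations over MCP.

class MemoryStore:
    def __init__(self):
        self._memories: list[str] = []

    def add_memories(self, texts: list[str]) -> None:
        """Store new memory objects."""
        self._memories.extend(texts)

    def search_memory(self, query: str) -> list[str]:
        """Retrieve relevant memories (naive substring match here;
        the actual server performs semantic retrieval)."""
        return [m for m in self._memories if query.lower() in m.lower()]

    def list_memories(self) -> list[str]:
        """View all stored memories."""
        return list(self._memories)

    def delete_all_memories(self) -> None:
        """Clear memory entirely."""
        self._memories.clear()

store = MemoryStore()
store.add_memories(["Prefers TypeScript with strict mode",
                    "Project uses PostgreSQL"])
print(store.search_memory("typescript"))  # ['Prefers TypeScript with strict mode']
store.delete_all_memories()
print(store.list_memories())  # []
```

An MCP client would invoke the same operations as named tools over the protocol rather than calling methods directly; the store sketch just shows what each tool is expected to do.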

What It Enables

Cross-Client Memory Access

Store context in Cursor and retrieve it later in Claude or Windsurf without repeating yourself.

Fully Local Memory Store

All memory is stored on your machine. Nothing goes to the cloud. You maintain full ownership and control.

Unified Memory UI

The built-in OpenMemory dashboard provides a central view of everything stored. Add, browse, and delete memories, and control which clients can access them, directly from the dashboard.

Supported Clients

The OpenMemory MCP Server is compatible with any client that supports the Model Context Protocol. This includes:

  • Cursor
  • Claude Desktop
  • Windsurf
  • Cline, and more
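MCP clients typically register servers through a JSON configuration entry. As a hedged illustration only, a Claude Desktop-style `mcpServers` entry might look like the following; the server name, command, and package name are placeholders, not OpenMemory's documented install steps (see the project's README for the actual configuration):

```json
{
  "mcpServers": {
    "openmemory": {
      "command": "npx",
      "args": ["-y", "openmemory-mcp-server"]
    }
  }
}
```

With an entry like this in each client, every tool connects to the same local server, which is what lets memories written in one client surface in another.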

As more AI systems adopt MCP, your private memory becomes more valuable.

Real-World Examples

Scenario 1: Cross-Tool Project Flow

Define a project's technical requirements in Claude Desktop, build it in Cursor, and debug issues in Windsurf, all with shared context passed through OpenMemory.

Scenario 2: Preferences That Persist

Set your preferred code style or tone in one tool. When you switch to another MCP client, it can access those same preferences without redefining them.

Scenario 3: Project Knowledge

Save important project details once, then access them from any compatible AI tool, with no more repetitive explanations.

Conclusion

The OpenMemory MCP Server brings memory to MCP-compatible tools without giving up control or privacy. It solves a foundational limitation in modern LLM workflows: the loss of context across tools, sessions, and environments.

By standardizing memory operations and keeping all data local, it reduces token overhead, improves performance, and unlocks more intelligent interactions across the growing ecosystem of AI assistants.

This is just the beginning. The MCP server is the first core layer in the OpenMemory platform, a broader effort to make memory portable, private, and interoperable across AI systems.

Getting Started Today

With OpenMemory, your AI memories stay private, portable, and under your control, exactly where they belong.

OpenMemory: Your memories, your control.

Contributing

OpenMemory is open source and we welcome contributions. Please see the CONTRIBUTING.md file for more information.

By YXI.AI
