New open-source library lets developers give any AI agent long-term memory that humans can read, edit, and trust—extracted from OpenClaw’s viral memory system and powered by the Milvus vector database.
REDWOOD CITY, Calif., March 12, 2026 /PRNewswire/ — Zilliz, the company behind Milvus, the world’s most widely adopted open-source vector database, today announced the open-source release of memsearch, a lightweight library that gives AI agents persistent, long-term memory across conversations. Available now under the MIT license, memsearch was created by Zilliz engineers, who extracted and rebuilt the memory subsystem originally developed for OpenClaw—the autonomous AI agent that captured 189,000+ GitHub stars in under two weeks—and turned it into a reusable, standalone tool any developer can adopt.
The AI agent ecosystem has a memory problem. When a conversation ends, most agents forget everything—context, preferences, prior decisions—forcing users to repeat themselves session after session. Existing solutions that attempt to address this typically store information in opaque, proprietary formats that only the AI can access, leaving developers unable to inspect, correct, or migrate what their agent has learned.
Memsearch takes a fundamentally different approach. Rather than abstracting memory behind an inaccessible black box, it stores all agent memories as plain-text files—readable, editable, and version-controllable by any developer. Milvus then automatically indexes those files, enabling fast, accurate semantic retrieval at scale.
“Memory is the missing layer in the AI agent stack. Developers deserve to know what their agents remember, fix it when it’s wrong, and carry it forward without lock-in. memsearch is our answer to that—transparent, portable, and built on the open-source foundation the community already trusts.” — Jiang Chen, Head of Developer Relations, Zilliz
Why Memsearch — Key Benefits for Developers
- Full transparency: Every memory is a human-readable text file. Developers can see exactly what their AI agent knows—no special tools, no proprietary dashboards required.
- Easy correction: Fix a bad memory by editing a file. memsearch detects the change automatically—no retraining, no data pipelines, no manual re-ingestion.
- Team collaboration: Because memory lives in standard files, teams can apply familiar version-control workflows—Git commits, pull request reviews, rollbacks—to what their agent remembers.
- True portability: Switching machines, AI models, or cloud providers requires nothing more than copying files. There is no proprietary export step, no migration script, no vendor permission.
- Plug-and-play integration: memsearch works with any AI agent framework. A single command installs it; no infrastructure changes are required.
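The announcement does not specify how memsearch detects that a memory file was edited by hand. One common technique is comparing file modification times between scans; the minimal sketch below shows that approach (the class and method names are hypothetical, not memsearch’s API).

```python
from pathlib import Path


class ChangeDetector:
    """Report which memory files changed on disk since the last scan."""

    def __init__(self, root: Path):
        self.root = root
        self.seen: dict[str, float] = {}

    def changed(self) -> list[str]:
        # Compare each file's mtime to the one recorded at the last scan;
        # new or modified files are returned for re-indexing.
        out: list[str] = []
        for p in sorted(self.root.glob("*.md")):
            mtime = p.stat().st_mtime
            if self.seen.get(p.name) != mtime:
                self.seen[p.name] = mtime
                out.append(p.name)
        return out
```

A real implementation would likely combine this with filesystem notifications or content hashing, since mtime granularity varies by platform, but the contract is the same: only changed files are re-embedded and re-indexed.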
Dedicated Plugin for Claude Code
Alongside memsearch, Zilliz is releasing the memsearch ccplugin—a purpose-built persistent memory plugin for Claude Code, Anthropic’s AI-powered coding assistant.
AI-assisted coding sessions accumulate significant context over time: architectural decisions, debugging history, project conventions, and team preferences. Today, that context evaporates when a session ends. The memsearch ccplugin automatically captures session summaries and injects relevant prior context at the start of every new session—so Claude Code arrives informed, not amnesiac.
Installation takes a single command.
For full technical detail on the memsearch architecture and the Claude Code plugin, see the Zilliz engineering blog.
Availability
memsearch is free and open source (MIT license) and is available immediately.
- GitHub: github.com/zilliztech/memsearch
- Documentation: zilliztech.github.io/memsearch
- Claude Code Plugin: zilliztech.github.io/memsearch/claude-plugin
About Zilliz
Zilliz is the company behind Milvus, the world’s most widely adopted open-source vector database. Zilliz Cloud brings that performance to production with a fully managed, cloud-native platform built for scalable, low-latency vector search and hybrid retrieval. It supports billion-scale workloads with sub-10ms latency, auto-scaling, and optimized indexes for GenAI use cases like semantic search and RAG.
Zilliz is built to make AI not just possible—but practical. With a focus on performance and cost-efficiency, it helps engineering teams move from prototype to production without overprovisioning or complex infrastructure. Over 10,000 organizations worldwide rely on Zilliz to build intelligent applications at scale.
Headquartered in Redwood Shores, California, Zilliz is backed by leading investors, including Aramco’s Prosperity 7 Ventures, Temasek’s Pavilion Capital, Hillhouse Capital, 5Y Capital, Yunqi Partners, Trustbridge Partners, and others. Learn more at Zilliz.com.
