| Field | Value |
|-------|-------|
| Name | marm-mcp-server |
| Version | 2.2.3 |
| Summary | MARM-Systems is a complete protocol and platform, combining an advanced memory backend, modular semantic search, and agent-to-agent coordination with a scientifically structured, community-vetted methodology for reasoning, session recall, and collaborative AI workflows. More than just a set of tools, it's a complete AI memory ecosystem. |
| Author | Ryan Lyell (lyell@marmsystems.com) |
| Upload time | 2025-09-19 05:50:36 |
| Requires Python | >=3.10 |
| License | MIT |
| Keywords | mcp, ai, memory, claude, assistant, protocol |
| Homepage | https://marmsystems.com |
| Repository | https://github.com/Lyellr88/MARM-Systems |
| Documentation | https://docs.marmsystems.com |
| Bug Tracker | https://github.com/Lyellr88/MARM-Systems/issues |
| Docker Hub | https://hub.docker.com/r/lyellr88/marm-mcp-server |
<div align="center">
<picture>
<img src="https://github.com/Lyellr88/MARM-Systems/blob/MARM-main/media/marm-main.jpg"
     alt="MARM - The AI That Remembers Your Conversations"
     width="700"
     height="350" />
</picture>
<h1 align="center">MARM: The AI That Remembers Your Conversations</h1>
Memory Accurate Response Mode v2.2.3 - The intelligent memory system for AI agents. Stop losing context. Stop hallucinations. Start controlling your LLM conversations.
**Note:** This is the *official* MARM repository. All official versions and releases are managed here.
Forks may experiment, but official updates will always come from this repo.
</div>
---
## ⚡ **INSTANT SETUP** - Ready in 60 seconds
**Already have MARM running?** Connect instantly:
**Claude Code users:**
```bash
/mcp # → Instant connection to your MARM server!
```
**Claude Desktop users:**
```json
// Add to your MCP settings:
{
  "mcpServers": {
    "marm-memory": {
      "command": "docker",
      "args": ["exec", "marm-mcp-server", "python", "/app/server.py"]
    }
  }
}
```
**✅ 19 memory tools loaded ✅ Persistent sessions ✅ Semantic search**
**Don't have MARM yet?** Install + Connect:
```bash
# 1. Pull & Start (30 seconds)
docker run -d --name marm-mcp-server -p 8001:8001 lyellr88/marm-mcp-server:latest
# 2. Connect to Claude (5 seconds)
/mcp add marm-memory http://localhost:8001/mcp
# 3. Activate (instant)
/mcp
```
**🎯 You now have AI with perfect memory across all conversations!**
---
## 🚀 Full Installation Guide
**Docker (Fastest - 30 seconds):**
```bash
docker pull lyellr88/marm-mcp-server:latest
docker run -d --name marm-mcp-server -p 8001:8001 lyellr88/marm-mcp-server:latest
claude mcp add marm-memory http://localhost:8001/mcp
```
**Quick Local Install:**
```bash
# Cross-platform install from PyPI
pip install marm-mcp-server==2.2.3

# Or work from a local clone of the repo instead:
# cd MARM-Systems/marm-mcp-server

claude mcp add marm-memory http://localhost:8001/mcp
```
**Key Information:**
- **Server Endpoint**: `http://localhost:8001/mcp`
- **API Documentation**: `http://localhost:8001/docs`
- **Supported Clients**: Claude Code, Qwen CLI, Gemini CLI, and any MCP-compatible LLM client or LLM platform
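Before wiring up a client, you can sanity-check these endpoints from Python. A minimal sketch with `requests` (it assumes the default port mapping from the Docker command above; a plain GET against `/mcp` is only a liveness probe, not a real MCP handshake):

```python
import requests

BASE = "http://localhost:8001"

# The interactive API docs should answer once the container is healthy.
docs = requests.get(f"{BASE}/docs", timeout=5)
print("API docs reachable:", docs.status_code == 200)

# /mcp speaks the MCP protocol, so any response at all just proves liveness.
probe = requests.get(f"{BASE}/mcp", timeout=5)
print("MCP endpoint responding with status:", probe.status_code)
```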
**All Installation Options:**
- **Docker** (Fastest): One command, works everywhere
- **Automated Setup**: One command with dependency validation
- **Manual Installation**: Step-by-step with virtual environment
- **Quick Test**: Zero-configuration trial run
**Choose your installation method:**
| Installation Type | Guide | Best For |
|-------------------|-------|----------|
| **Docker** | **[INSTALL-DOCKER.md](https://github.com/Lyellr88/MARM-Systems/blob/MARM-main/docs/INSTALL-DOCKER.md)** | Cross-platform, production deployment |
| **Windows** | **[INSTALL-WINDOWS.md](https://github.com/Lyellr88/MARM-Systems/blob/MARM-main/docs/INSTALL-WINDOWS.md)** | Native Windows development |
| **Linux** | **[INSTALL-LINUX.md](https://github.com/Lyellr88/MARM-Systems/blob/MARM-main/docs/INSTALL-LINUX.md)** | Native Linux development |
| **Platforms** | **[INSTALL-PLATFORM.md](https://github.com/Lyellr88/MARM-Systems/blob/MARM-main/docs/INSTALL-PLATFORM.md)** | App & API integration |
---
## 🎯 Why MARM?
MARM (Memory Accurate Response Mode) is a comprehensive AI memory ecosystem I designed to solve the problem of context loss in large language models. What started as a simple protocol has evolved into a suite of tools that provide a persistent, intelligent, and cross-platform memory for any AI agent.
The MARM ecosystem consists of three main components:
- **The MARM Protocol:** A set of rules and commands for structured, reliable AI interaction.
- **The MARM Universal MCP Server:** A production-ready memory intelligence platform that provides a powerful, stateful backend for any MCP-compatible AI client.
- **The MARM Chatbot:** A web-based interface for interacting with the MARM protocol directly.
Whether you're a developer looking to build the next generation of AI agents, a researcher studying AI behavior, or simply a power user who wants to have more productive conversations with your AI, the MARM ecosystem provides the tools you need to unlock the full potential of large language models.
<div align="center">
<picture>
<img src="https://github.com/Lyellr88/MARM-Systems/blob/MARM-main/media/google-overview.PNG"
     alt="MARM in Google AI Overview for AI memory protocol queries"
     width="700"
     height="350" />
</picture>
</div>
<p align="center">*Appears in Google AI Overview for AI memory protocol queries (as of Aug 2025)*
The newest addition to the ecosystem is MARM MCP, which represents an emerging category of MCP server: one that integrates a complete protocol layer with intelligent memory systems. Built on FastAPI and SQLite, it combines the MARM protocol with semantic search, session management, and smart retrieval to bridge tool access with structured reasoning. This creates a more consistent, user-controlled LLM experience that goes beyond simple tool exposure.
| **Category** | **Feature** | **Description** |
|--------------|-------------|-----------------|
| **🧠 Memory** | **Semantic Search** | Find memories by meaning using AI embeddings, not keyword matching |
| | **Auto-Classification** | Content intelligently categorized (code, project, book, general) |
| | **Cross-Session Memory** | Memories survive across different AI agent conversations |
| | **Smart Recall** | Vector similarity search with context-aware intelligent fallbacks |
| **🤝 Multi-AI** | **Unified Memory Layer** | Accessible by any connected LLM (Claude, Qwen, Gemini, etc.) |
| | **Cross-Platform Intelligence** | Different AI agents learn from each other's interactions |
| | **User-Controlled Memory** | Granular control over memory sharing and "Bring Your Own History" |
| **🏗️ Architecture** | **19 Complete MCP Tools** | Full Model Context Protocol implementation |
| | **Database Optimization** | SQLite with WAL mode and connection pooling |
| | **Rate Limiting** | IP-based protection for sustainable free service |
| | **MCP Compliance** | Response size management for optimal performance |
| | **Docker Ready** | Containerized deployment with health monitoring |
| **⚡ Advanced** | **Usage Analytics** | Privacy-conscious insights for platform optimization |
| | **Event-Driven System** | Self-managing architecture with comprehensive error isolation |
| | **Structured Logging** | Development and debugging support with `structlog` |
| | **Health Monitoring** | Real-time system status and performance tracking |
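To make the Semantic Search and Smart Recall rows concrete: recall ranks memories by embedding similarity rather than keyword overlap. A minimal illustration of the idea using the same `all-MiniLM-L6-v2` model listed in the architecture section below (this is a sketch of the technique, not MARM's internal code):

```python
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")

memories = [
    "Decided to use SQLite with WAL mode for the backend",
    "User prefers dark roast coffee",
    "The API rate limit is 60 requests per minute",
]
query = "what database did we pick?"

# Encode memories and query into dense vectors, then rank by cosine similarity.
mem_vecs = model.encode(memories, convert_to_tensor=True)
query_vec = model.encode(query, convert_to_tensor=True)
scores = util.cos_sim(query_vec, mem_vecs)[0]

best = scores.argmax().item()
print(f"Best match ({float(scores[best]):.2f}): {memories[best]}")
```

Note that the query shares no keywords with the winning memory; the match comes entirely from meaning.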
---
## Why I Built MARM
MARM started with my own frustrations: AI losing context, repeating itself, and drifting off track. But I didn’t stop there. I asked a simple question in a few AI subreddits:
*“What’s the one thing you wish your LLM could do better?”*
The replies echoed the same pain points:
- Keep memory accurate
- Give users more control
- Be transparent, not a black box
That feedback confirmed the gap I already saw. I took those shared frustrations, found the middle ground, and built MARM. Early contributors validated the idea and shaped features, but the core system grew out of both personal trial and community insight.
MARM is the result of combining individual persistence with collective needs, a protocol designed to solve what we all kept running into.
### Discord
Join Discord for upcoming features and builds, plus a safe space to share your work and get constructive feedback.
[MARM Discord](https://discord.gg/EuBsHvSRks)
---
## Before MARM vs After MARM
**Without MARM:**
- "Wait, what were we discussing about the database schema?"
- AI repeats previous suggestions you already rejected
- Loses track of project requirements mid-conversation
- Starts from scratch every time you return
**With MARM:**
- AI references your logged project notes and decisions
- Maintains context across multiple sessions
- Builds on previous discussions instead of starting over
- Remembers what works and what doesn't for your project
---
## Why Use MARM?
Modern LLMs often lose context or fabricate information. MARM introduces a session memory kernel, structured logs, and a user-controlled knowledge library, anchoring the AI to *your* logic and data. It’s more than a chatbot wrapper. It’s a methodology for accountable AI.
### Command Overview
| **Category** | **Command** | **Function** |
|--------------|-------------|--------------|
| **Session** | `/start marm` | Activate protocol |
| | `/refresh marm` | Reaffirm/reset context |
| **Core** | `/log` | Start structured session logging |
| | `/notebook` | Store key data |
| | `/summary:` | Summarize and reseed sessions |
| **Advanced** | `/deep dive` | Request context-aware response |
| | `/show reasoning` | Reveal logic trail of last answer |
Need a walkthrough or troubleshooting help? The [`MARM-HANDBOOK.md`](https://github.com/Lyellr88/MARM-Systems/blob/MARM-main/MARM-HANDBOOK.md) covers all aspects of using MARM.
---
# 🛠️ MARM MCP Server Guide
Now that you understand the ecosystem, here's how to actually use the MCP server with your AI agents.
---
## 🛠️ Complete MCP Tool Suite (19 Tools)
| **Category** | **Tool** | **Description** |
|--------------|----------|-----------------|
| **🧠 Memory Intelligence** | `marm_smart_recall` | AI-powered semantic similarity search across all memories. Supports global search with `search_all=True` flag |
| | `marm_contextual_log` | Intelligent auto-classifying memory storage using vector embeddings |
| **🚀 Session Management** | `marm_start` | Activate MARM intelligent memory and response accuracy layers |
| | `marm_refresh` | Refresh AI agent session state and reaffirm protocol adherence |
| **📚 Logging System** | `marm_log_session` | Create or switch to named session container |
| | `marm_log_entry` | Add structured log entry with auto-date formatting |
| | `marm_log_show` | Display all entries and sessions (filterable) |
| | `marm_log_delete` | Delete specified session or individual entries |
| **🔄 Reasoning & Workflow** | `marm_summary` | Generate context-aware summaries with intelligent truncation for LLM conversations |
| | `marm_context_bridge` | Smart context bridging for seamless AI agent workflow transitions |
| **📔 Notebook Management** | `marm_notebook_add` | Add new notebook entry with semantic embeddings |
| | `marm_notebook_use` | Activate entries as instructions (comma-separated) |
| | `marm_notebook_show` | Display all saved keys and summaries |
| | `marm_notebook_delete` | Delete specific notebook entry |
| | `marm_notebook_clear` | Clear the active instruction list |
| | `marm_notebook_status` | Show current active instruction list |
| **⚙️ System Utilities** | `marm_current_context` | Get current date/time for accurate log entry timestamps |
| | `marm_system_info` | Comprehensive system information, health status, and loaded docs |
| | `marm_reload_docs` | Reload documentation into memory system |
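As a rough sketch of driving these tools programmatically: any MCP client can call them by name once connected to the server endpoint. The example below assumes the official `mcp` Python SDK with an SSE transport; `search_all=True` comes from the table above, but the `query` argument name is an illustrative guess, so check the MCP Handbook for the real parameter schemas.

```python
import asyncio

from mcp import ClientSession
from mcp.client.sse import sse_client

async def main() -> None:
    # Assumes the server from the install steps is listening on port 8001.
    async with sse_client("http://localhost:8001/mcp") as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print("Available tools:", [t.name for t in tools.tools])

            # Global semantic recall across sessions (argument names assumed).
            result = await session.call_tool(
                "marm_smart_recall",
                arguments={"query": "database schema decisions", "search_all": True},
            )
            print(result.content)

asyncio.run(main())
```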
---
## 🏗️ Architecture Overview
### **Core Technology Stack**
```txt
FastAPI (0.115.4) + FastAPI-MCP (0.4.0) - v2.2.3
├── SQLite with WAL Mode + Custom Connection Pooling
├── Sentence Transformers (all-MiniLM-L6-v2) + Semantic Search
├── Structured Logging (structlog) + Memory Monitoring (psutil)
├── IP-Based Rate Limiting + Usage Analytics
├── MCP Response Size Compliance (1MB limit)
├── Event-Driven Automation System
├── Docker Containerized Deployment + Health Monitoring
└── Advanced Memory Intelligence + Auto-Classification
```
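The "SQLite with WAL Mode + Custom Connection Pooling" line is the heart of the storage layer: WAL lets readers proceed while a single writer commits, and a small pool bounds concurrent connections. A minimal sketch of that pattern (illustrative only, not MARM's actual pool implementation):

```python
import sqlite3
from queue import Queue

class SQLitePool:
    """Tiny thread-safe pool: pre-open N connections, hand them out via a queue."""

    def __init__(self, path: str, size: int = 5):
        self._pool: Queue[sqlite3.Connection] = Queue(maxsize=size)
        for _ in range(size):
            conn = sqlite3.connect(path, check_same_thread=False)
            # WAL mode allows concurrent readers alongside one writer.
            conn.execute("PRAGMA journal_mode=WAL")
            self._pool.put(conn)

    def acquire(self) -> sqlite3.Connection:
        return self._pool.get()  # blocks until a connection is free

    def release(self, conn: sqlite3.Connection) -> None:
        self._pool.put(conn)

pool = SQLitePool("marm.db", size=5)  # default limit of 5, per the notes below
conn = pool.acquire()
try:
    conn.execute("SELECT 1")
finally:
    pool.release(conn)
```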
### **Database Schema (5 Tables)**
#### `memories` - Core Memory Storage
```sql
CREATE TABLE memories (
id TEXT PRIMARY KEY,
session_name TEXT NOT NULL,
content TEXT NOT NULL,
embedding BLOB, -- AI vector embeddings for semantic search
timestamp TEXT NOT NULL,
context_type TEXT DEFAULT 'general', -- Auto-classified content type
metadata TEXT DEFAULT '{}',
created_at TEXT DEFAULT CURRENT_TIMESTAMP
);
```
#### `sessions` - Session Management
```sql
CREATE TABLE sessions (
session_name TEXT PRIMARY KEY,
marm_active BOOLEAN DEFAULT FALSE,
created_at TEXT DEFAULT CURRENT_TIMESTAMP,
last_accessed TEXT DEFAULT CURRENT_TIMESTAMP,
metadata TEXT DEFAULT '{}'
);
```
#### Plus: `log_entries`, `notebook_entries`, `user_settings`
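Given the DDL above, storing a memory with its vector reduces to serializing the embedding into the `embedding` BLOB column. A hypothetical helper (the table and column names come from the schema above; the float32-bytes serialization and the `store_memory` name are assumptions for illustration):

```python
import json
import sqlite3
import uuid
from datetime import datetime, timezone

import numpy as np

def store_memory(conn: sqlite3.Connection, session: str, content: str,
                 embedding: np.ndarray, context_type: str = "general") -> str:
    """Insert one row into the `memories` table defined above (illustrative)."""
    mem_id = str(uuid.uuid4())
    conn.execute(
        "INSERT INTO memories (id, session_name, content, embedding, "
        "timestamp, context_type, metadata) VALUES (?, ?, ?, ?, ?, ?, ?)",
        (
            mem_id,
            session,
            content,
            np.asarray(embedding, dtype=np.float32).tobytes(),  # vector -> BLOB
            datetime.now(timezone.utc).isoformat(),
            context_type,
            json.dumps({}),
        ),
    )
    conn.commit()
    return mem_id
```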
---
## 📈 Performance & Scalability
### **Production Optimizations**
- **Custom SQLite Connection Pool**: Thread-safe with configurable limits (default: 5)
- **WAL Mode**: Write-Ahead Logging for concurrent access performance
- **Lazy Loading**: Semantic models loaded only when needed (resource efficient)
- **Intelligent Caching**: Memory usage optimization with cleanup cycles
- **Response Size Management**: MCP 1MB compliance with smart truncation
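The 1MB response ceiling mentioned above can be enforced with a simple guard before a response leaves the server, for example trimming trailing results until the serialized payload fits. A sketch of the idea (the limit comes from the compliance note above; the drop-from-the-end strategy is an assumption, a real server might summarize instead):

```python
import json

MCP_MAX_BYTES = 1_000_000  # ~1 MB MCP response ceiling noted above

def fit_response(items: list[dict]) -> list[dict]:
    """Drop trailing items until the JSON payload fits the size limit."""
    while items and len(json.dumps(items).encode("utf-8")) > MCP_MAX_BYTES:
        items = items[:-1]
    return items
```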
### **Rate Limiting Tiers**
- **Default**: 60 requests/minute, 5min cooldown
- **Memory Heavy**: 20 requests/minute, 10min cooldown (semantic search)
- **Search Operations**: 30 requests/minute, 5min cooldown
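A fixed-window sketch of these tiers, keyed by client IP (the per-tier numbers come from the list above; the windowing strategy itself is an assumption, not MARM's actual limiter):

```python
import time
from collections import defaultdict

# (requests per minute, cooldown seconds) per tier, from the list above.
TIERS = {"default": (60, 300), "memory_heavy": (20, 600), "search": (30, 300)}

hits: dict[tuple[str, str], list[float]] = defaultdict(list)
blocked_until: dict[tuple[str, str], float] = {}

def allow(ip: str, tier: str = "default") -> bool:
    """Return True if this request fits the tier's per-minute budget."""
    limit, cooldown = TIERS[tier]
    key, now = (ip, tier), time.monotonic()
    if now < blocked_until.get(key, 0.0):
        return False  # still cooling down
    window = [t for t in hits[key] if now - t < 60.0]  # 1-minute window
    if len(window) >= limit:
        blocked_until[key] = now + cooldown
        return False
    window.append(now)
    hits[key] = window
    return True
```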
---
## 📚 Documentation for MCP
| Guide Type | Document | Description |
|------------|----------|-------------|
| **Docker Setup** | **[INSTALL-DOCKER.md](https://github.com/Lyellr88/MARM-Systems/blob/MARM-main/docs/INSTALL-DOCKER.md)** | Cross-platform, production deployment |
| **Windows Setup** | **[INSTALL-WINDOWS.md](https://github.com/Lyellr88/MARM-Systems/blob/MARM-main/docs/INSTALL-WINDOWS.md)** | Native Windows development |
| **Linux Setup** | **[INSTALL-LINUX.md](https://github.com/Lyellr88/MARM-Systems/blob/MARM-main/docs/INSTALL-LINUX.md)** | Native Linux development |
| **Platform Integration** | **[INSTALL-PLATFORM.md](https://github.com/Lyellr88/MARM-Systems/blob/MARM-main/docs/INSTALL-PLATFORM.md)** | App & API integration |
| **MCP Handbook** | **[MCP-HANDBOOK.md](https://github.com/Lyellr88/MARM-Systems/blob/MARM-main/MCP-HANDBOOK.md)** | Complete usage guide with all 19 MCP tools, cross-app memory strategies, pro tips, and FAQ |
---
## 🆚 Competitive Advantage
### **vs. Basic MCP Implementations**
| Feature | MARM v2.2.3 | Basic MCP Servers |
|---------|-------------|-------------------|
| **Memory Intelligence** | AI-powered semantic search with auto-classification | Basic key-value storage |
| **Tool Coverage** | 19 complete MCP protocol tools | 3-5 basic wrappers |
| **Scalability** | Database optimization + connection pooling | Single connection |
| **MCP Compliance** | 1MB response size management | No size controls |
| **Deployment** | Docker containerization + health monitoring | Local development only |
| **Analytics** | Usage tracking + business intelligence | No tracking |
| **Codebase Maturity** | 2,500+ lines of professional code | 200-800 lines |
---
## 🤝 Contributing
**Aren't you sick of explaining every project you're working on to every LLM you work with?**
MARM is building the solution to this. Support now to join a growing ecosystem - this is just **Phase 1 of a 3-part roadmap** and our next build will complement MARM like peanut butter and jelly.
**Join the repo that's working to give YOU control over what is remembered and how it's remembered.**
### **Why Contribute Now?**
- **Ground floor opportunity** - Be part of the MCP memory revolution from the beginning
- **Real impact** - Your contributions directly solve problems you face daily with AI agents
- **Growing ecosystem** - Help build the infrastructure that will power tomorrow's AI workflows
- **Phase 1 complete** - Proven foundation ready for the next breakthrough features
### **Development Priorities**
1. **Load Testing**: Validate deployment performance under real AI workloads
2. **Documentation**: Expand API documentation and LLM integration guides
3. **Performance**: AI model caching and memory optimization
4. **Features**: Additional MCP protocol tools and multi-tenant capabilities
---
## Join the MARM Community
**Help build the future of AI memory - no coding required!**
**Connect:** [MARM Discord](https://discord.gg/EuBsHvSRks) | [GitHub Discussions](https://github.com/Lyellr88/MARM-Systems/discussions)
### Easy Ways to Get Involved
- **Try the MCP server or Chatbot** and share your experience
- **Star the repo** if MARM solves a problem for you
- **Share on social** - help others discover memory-enhanced AI
- **Open [issues](https://github.com/Lyellr88/MARM-Systems/issues)** with bugs, feature requests, or use cases
- **Join discussions** about AI reliability and memory
### For Developers
- **Build integrations** - MCP tools, browser extensions, API wrappers
- **Enhance the memory system** - improve semantic search and storage
- **Expand platform support** - new deployment targets and integrations
- **Submit [Pull Requests](https://github.com/Lyellr88/MARM-Systems/pulls)** - Every PR helps MARM grow. Big or small, I review each with respect and openness to see how it can improve the project
### ⭐ Star the Project
If MARM helps with your AI memory needs, please star the repository to support development!
---
### License & Usage Notice
This project is licensed under the MIT License. Forks and derivative works are permitted.
However, use of the **MARM name** and **version numbering** is reserved for releases from the [official MARM repository](https://github.com/Lyellr88/MARM-Systems).
Derivatives should clearly indicate they are unofficial or experimental.
---
## 📁 Project Documentation
### **Usage Guides**
- **[MCP-HANDBOOK.md](https://github.com/Lyellr88/MARM-Systems/blob/MARM-main/MCP-HANDBOOK.md)** - Complete MCP server usage guide with commands, workflows, and examples
- **[PROTOCOL.md](https://github.com/Lyellr88/MARM-Systems/blob/MARM-main/PROTOCOL.md)** - Quick start commands and protocol reference
- **[FAQ.md](https://github.com/Lyellr88/MARM-Systems/blob/MARM-main/docs/FAQ.md)** - Answers to common questions about using MARM
### **MCP Server Installation**
- **[INSTALL-DOCKER.md](https://github.com/Lyellr88/MARM-Systems/blob/MARM-main/docs/INSTALL-DOCKER.md)** - Docker deployment (recommended)
- **[INSTALL-WINDOWS.md](https://github.com/Lyellr88/MARM-Systems/blob/MARM-main/docs/INSTALL-WINDOWS.md)** - Windows installation guide
- **[INSTALL-LINUX.md](https://github.com/Lyellr88/MARM-Systems/blob/MARM-main/docs/INSTALL-LINUX.md)** - Linux installation guide
- **[INSTALL-PLATFORMS.md](https://github.com/Lyellr88/MARM-Systems/blob/MARM-main/docs/INSTALL-PLATFORMS.md)** - Platform installation guide
### **Project Information**
- **[README.md](https://github.com/Lyellr88/MARM-Systems/blob/MARM-main/README.md)** - This file - ecosystem overview and MCP server guide
- **[CONTRIBUTING.md](https://github.com/Lyellr88/MARM-Systems/blob/MARM-main/docs/CONTRIBUTING.md)** - How to contribute to MARM
- **[DESCRIPTION.md](https://github.com/Lyellr88/MARM-Systems/blob/MARM-main/docs/DESCRIPTION.md)** - Protocol purpose and vision overview
- **[CHANGELOG.md](https://github.com/Lyellr88/MARM-Systems/blob/MARM-main/docs/CHANGELOG.md)** - Version history and updates
- **[ROADMAP.md](https://github.com/Lyellr88/MARM-Systems/blob/MARM-main/docs/ROADMAP.md)** - Planned features and development roadmap
- **[LICENSE](https://github.com/Lyellr88/MARM-Systems/blob/MARM-main/docs/LICENSE)** - MIT license terms
---
mcp-name: io.github.Lyellr88/marm-mcp-server
> Built with ❤️ by MARM Systems - Universal MCP memory intelligence