# InMemory - Enhanced Memory Management for AI
<p align="center">
<strong>🧠 Long-term memory for AI Agents with zero-setup simplicity</strong>
</p>
<p align="center">
<strong>⚡ Zero Dependencies • 🚀 Instant Setup • 💼 Enterprise Ready</strong>
</p>
## 🔥 Key Features
- **🚀 Zero Setup**: `pip install inmemory` and start using immediately
- **🏗️ Flexible Architecture**: File-based → API Server → Enterprise MongoDB
- **🔍 Advanced Search**: Semantic, temporal, tag-based, and people-based filtering
- **🚫 Duplicate Detection**: Prevents storing similar memories
- **⚙️ Configurable Backends**: File storage, MongoDB, PostgreSQL (coming soon)
- **🌐 Multiple Interfaces**: Python SDK, REST API, MCP server
## 🚀 Quick Start
### Instant Usage (Zero Dependencies)
```bash
pip install inmemory
```
```python
from inmemory import Memory
# Works immediately - no setup required!
memory = Memory()
# Add memories with rich metadata
memory.add(
    "I love pizza but hate broccoli",
    user_id="alice",
    tags="food,preferences"
)

memory.add(
    "Meeting with Bob and Carol about Q4 planning tomorrow at 3pm",
    user_id="alice",
    tags="work,meeting",
    people_mentioned="Bob,Carol",
    topic_category="planning"
)

# Search memories
results = memory.search("pizza", user_id="alice")
for result in results["results"]:
    print(f"Memory: {result['memory']}")
    print(f"Tags: {result['tags']}")
    print(f"Score: {result['score']}")

# Advanced searches
work_memories = memory.search_by_tags(["work"], user_id="alice")
people_memories = memory.search_by_people(["Bob"], user_id="alice")
recent_memories = memory.temporal_search("today", user_id="alice")
```
### API Server Mode
```bash
pip install inmemory[server]
# Start API server (file-based backend)
inmemory serve --port 8080
# Or with MongoDB backend (requires MongoDB)
inmemory serve --storage-type mongodb --port 8080
```
### Enterprise Mode (Dashboard Integration)
```bash
pip install inmemory[enterprise]
# Set environment variables
export MONGODB_URI="mongodb://localhost:27017/inmemory"
export GOOGLE_CLIENT_ID="your-google-client-id"
export GOOGLE_CLIENT_SECRET="your-google-client-secret"
# Start the enterprise server
python main.py
```
## 📦 Installation Options
| Mode | Command | Dependencies | Use Case |
|------|---------|--------------|----------|
| **Basic SDK** | `pip install inmemory` | Zero external deps | Development, testing, simple apps |
| **API Server** | `pip install inmemory[server]` | FastAPI, Uvicorn | Integration, dashboards |
| **Enterprise** | `pip install inmemory[enterprise]` | MongoDB, OAuth | Production, multi-user |
| **Full** | `pip install inmemory[full]` | Everything + MCP | Complete installation |
## 🏗️ Architecture
```
┌──────────────────────────────────────────────────────────┐
│                     InMemory Package                     │
├──────────────────────────────────────────────────────────┤
│ SDK Layer     │ Memory Class (Primary Interface)         │
│ API Layer     │ FastAPI Server (Optional)                │
│ Storage Layer │ File (Default) │ MongoDB (Enterprise)    │
│ Search Layer  │ Enhanced Search Engine + Qdrant          │
└──────────────────────────────────────────────────────────┘
```
## 💡 Core API Reference
### Memory Class
```python
from inmemory import Memory
# Initialize with different backends
memory = Memory() # Auto-detect (file by default)
memory = Memory(storage_type="file") # Force file storage
memory = Memory(storage_type="mongodb") # Force MongoDB (requires deps)
# Memory operations
result = memory.add(content, user_id, tags=None, people_mentioned=None, topic_category=None)
results = memory.search(query, user_id, limit=10, tags=None, temporal_filter=None)
memories = memory.get_all(user_id, limit=100)
result = memory.delete(memory_id, user_id)
# Advanced search
results = memory.search_by_tags(["work", "important"], user_id, match_all=True)
results = memory.search_by_people(["Alice", "Bob"], user_id)
results = memory.temporal_search("yesterday", user_id, semantic_query="meetings")
# User management
result = memory.create_user(user_id, email="user@example.com")
api_key = memory.generate_api_key(user_id, name="my-app")
keys = memory.list_api_keys(user_id)
stats = memory.get_user_stats(user_id)
```
### Configuration
```python
from inmemory import InMemoryConfig, Memory
# Custom configuration
config = InMemoryConfig(
    storage={
        "type": "file",          # or "mongodb"
        "path": "~/my-memories"  # for file storage
    },
    auth={
        "type": "simple",        # or "oauth", "api_key"
        "default_user": "my_user"
    },
    qdrant={
        "host": "localhost",
        "port": 6333
    }
)
memory = Memory(config=config)
```
## 🌐 REST API Endpoints
When running in server mode (`inmemory serve`), these endpoints are available:
| Method | Endpoint | Description |
|--------|----------|-------------|
| `POST` | `/v1/memories` | Add new memory |
| `GET` | `/v1/memories` | Get user's memories |
| `DELETE` | `/v1/memories/{id}` | Delete specific memory |
| `POST` | `/v1/search` | Search memories |
| `POST` | `/v1/temporal-search` | Temporal search |
| `POST` | `/v1/search-by-tags` | Tag-based search |
| `POST` | `/v1/search-by-people` | People-based search |
| `GET` | `/v1/health` | Health check |
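The endpoints can be exercised with any HTTP client. The sketch below uses the `requests` library and assumes the request bodies mirror the SDK parameters (`content`, `user_id`, `tags`, `query`, `limit`); consult the interactive docs at `/docs` for the authoritative schema.
```python
# Minimal sketch of calling the REST API. Field names mirror the SDK and are
# assumptions; check the server's /docs page for the exact request schema.
import requests

BASE_URL = "http://localhost:8080"

# Add a memory
resp = requests.post(
    f"{BASE_URL}/v1/memories",
    json={
        "content": "I love pizza but hate broccoli",
        "user_id": "alice",
        "tags": "food,preferences",
    },
)
resp.raise_for_status()

# Search memories
resp = requests.post(
    f"{BASE_URL}/v1/search",
    json={"query": "pizza", "user_id": "alice", "limit": 5},
)
for item in resp.json().get("results", []):
    print(item)
```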
## 🔧 Configuration Options
### Environment Variables
```bash
# Storage backend
export INMEMORY_STORAGE_TYPE="file" # or "mongodb"
export INMEMORY_DATA_DIR="~/.inmemory" # for file storage
export MONGODB_URI="mongodb://localhost:27017/inmemory" # for mongodb
# Server settings
export INMEMORY_HOST="0.0.0.0"
export INMEMORY_PORT="8081"
# Qdrant settings
export QDRANT_HOST="localhost"
export QDRANT_PORT="6333"
```
### YAML Configuration
Create `~/.inmemory/config.yaml`:
```yaml
storage:
  type: "file"        # or "mongodb"
  path: "~/.inmemory/data"

auth:
  type: "simple"      # or "oauth", "api_key"
  default_user: "user123"

qdrant:
  host: "localhost"
  port: 6333

embedding:
  provider: "ollama"
  model: "nomic-embed-text"
  ollama_host: "http://localhost:11434"
```
## 🚀 Deployment
### Single File Deployment
```bash
# Just run the server - file storage included
inmemory serve --port 8080
```
### Docker Deployment
```bash
# Simple mode (file storage)
docker run -p 8080:8080 -v inmemory-data:/root/.inmemory inmemory:latest
# Enterprise mode (MongoDB)
docker-compose up # Uses provided docker-compose.yml
```
### Production Deployment
```bash
# Enterprise mode with MongoDB
export MONGODB_URI="mongodb://prod-mongo:27017/inmemory"
export GOOGLE_CLIENT_ID="your-prod-client-id"
export GOOGLE_CLIENT_SECRET="your-prod-client-secret"
inmemory serve --host 0.0.0.0 --port 8080
```
## 🔄 Migration Between Modes
Easily migrate from simple file storage to enterprise MongoDB:
```python
from inmemory.stores import FileBasedStore, MongoDBStore
# Initialize both backends
file_store = FileBasedStore()
mongo_store = MongoDBStore(mongodb_uri="mongodb://localhost:27017")
# Migrate all data
success = mongo_store.migrate_from_file_store(file_store)
print(f"Migration {'successful' if success else 'failed'}!")
```
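After migrating, it is worth spot-checking that record counts match. A minimal verification sketch, assuming the documented `Memory(storage_type=...)` constructor and that `get_all()` returns a dict with a `"results"` list (as `search()` does in the Quick Start):
```python
# Hedged verification sketch: compare per-user memory counts across backends.
# The "results" key on get_all() is an assumption based on the search() shape.
from inmemory import Memory

file_memory = Memory(storage_type="file")
mongo_memory = Memory(storage_type="mongodb")

user_id = "alice"  # repeat for each migrated user
file_count = len(file_memory.get_all(user_id, limit=1000)["results"])
mongo_count = len(mongo_memory.get_all(user_id, limit=1000)["results"])
print(f"{user_id}: file={file_count}, mongodb={mongo_count}")
assert file_count == mongo_count, "migration appears incomplete"
```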
## 🧪 Development & Testing
```bash
# Install with development tools
pip install inmemory[dev]
# Run tests
inmemory test
# Check configuration
inmemory config
# View storage statistics
inmemory stats
# Initialize with sample data
inmemory init
```
## 🤝 Integration Examples
### Personal AI Assistant
```python
from inmemory import Memory
from openai import OpenAI
class PersonalAssistant:
    def __init__(self):
        self.memory = Memory()
        self.llm = OpenAI()

    def chat(self, user_input: str, user_id: str) -> str:
        # Get relevant memories
        memories = self.memory.search(user_input, user_id=user_id, limit=5)
        context = "\n".join([m['memory'] for m in memories['results']])

        # Generate response with context
        response = self.llm.chat.completions.create(
            model="gpt-4o-mini",
            messages=[
                {"role": "system", "content": f"Context: {context}"},
                {"role": "user", "content": user_input}
            ]
        )

        # Store conversation
        self.memory.add(f"User: {user_input}", user_id=user_id)
        self.memory.add(f"Assistant: {response.choices[0].message.content}", user_id=user_id)

        return response.choices[0].message.content
```
### Customer Support Bot
```python
from inmemory import Memory
class SupportBot:
    def __init__(self):
        self.memory = Memory()

    def handle_ticket(self, customer_id: str, issue: str):
        # Check customer history
        history = self.memory.search_by_people([customer_id], user_id="support")
        similar_issues = self.memory.search(issue, user_id="support", limit=3)

        # Generate contextual response based on history
        response = self.generate_response(issue, history, similar_issues)

        # Store interaction
        self.memory.add(
            f"Customer {customer_id} reported: {issue}",
            user_id="support",
            tags="ticket,customer_support",
            people_mentioned=customer_id,
            topic_category="support"
        )

        return response
```
## 📚 Documentation
- **[Installation Guide](docs/installation-guide.md)**: Detailed installation and usage
- **[Architecture Plan](docs/open-source-architecture-plan.md)**: Technical architecture details
- **[API Reference](http://localhost:8081/docs)**: Interactive API documentation (available when the server is running)
## 🏢 Enterprise Features
For enterprise deployments, InMemory provides:
- **Multi-user Support**: MongoDB backend with user isolation
- **OAuth Integration**: Google OAuth for dashboard authentication
- **Scalable Storage**: MongoDB collections per user
- **API Key Management**: Secure key generation and management
- **Dashboard Ready**: REST API for your private dashboard integration
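To illustrate the user-isolation model, the sketch below uses only the documented SDK calls (`create_user`, `generate_api_key`, `add`, `search`); the exact return values of the user-management methods are not specified here and may differ.
```python
# Sketch of per-user isolation with the MongoDB backend.
# Return shapes of create_user/generate_api_key are assumptions.
from inmemory import Memory

memory = Memory(storage_type="mongodb")  # enterprise backend

memory.create_user("alice", email="alice@example.com")
memory.create_user("bob", email="bob@example.com")

memory.add("Prefers email follow-ups", user_id="alice", tags="preferences")
memory.add("Prefers phone calls", user_id="bob", tags="preferences")

# Searches are scoped to the given user_id, so Alice never sees Bob's memories.
print(memory.search("follow-ups", user_id="alice")["results"])
print(memory.search("follow-ups", user_id="bob")["results"])

# API keys can be issued per user for dashboard or service access.
alice_key = memory.generate_api_key("alice", name="dashboard")
```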
## 🤖 MCP Server Integration
InMemory works seamlessly with MCP (Model Context Protocol) for AI agent integration:
```bash
# Separate repository for MCP server
git clone https://github.com/you/inmemory-mcp
cd inmemory-mcp
pip install -e .
# Configure to connect to any InMemory API
export INMEMORY_API_URL="http://localhost:8080"
python src/server.py
```
## 🛠️ Requirements
### Minimal Installation
- **Python**: 3.12+
- **Qdrant**: Vector database for embeddings
- **Ollama**: Local embeddings (or OpenAI API key)
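If semantic search returns nothing, a quick way to confirm the local services are reachable is the sketch below, using the `qdrant-client` and `ollama` Python packages with the default hosts and ports from the configuration section; adjust them to match your setup.
```python
# Sanity-check sketch: verify Qdrant and Ollama are reachable on their defaults.
from qdrant_client import QdrantClient
from ollama import Client

# Qdrant on localhost:6333 (default from the configuration section)
qdrant = QdrantClient(host="localhost", port=6333)
print("Qdrant reachable:", qdrant.get_collections())

# Ollama on http://localhost:11434; the embedding model should appear in the list
ollama = Client(host="http://localhost:11434")
print("Ollama reachable:", ollama.list())
```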
### Enterprise Installation
- **MongoDB**: User management and authentication
- **Google OAuth**: Dashboard authentication
## 🎯 Roadmap
- [x] **Storage Abstraction**: File-based and MongoDB backends
- [x] **CLI Tools**: Easy server management
- [ ] **PostgreSQL Backend**: Alternative to MongoDB
- [ ] **TypeScript SDK**: Cross-language support
- [ ] **More Vector DBs**: Chroma, Pinecone integration
- [ ] **Cloud Storage**: S3, GCS backends
## 🤝 Contributing
We welcome contributions! Please see:
- **Issues**: Report bugs and request features
- **Pull Requests**: Follow our coding standards (ruff, pre-commit)
- **Documentation**: Help improve our guides
```bash
# Development setup
git clone https://github.com/you/inmemory
cd inmemory
pip install -e .[dev]
pre-commit install
# Run tests
inmemory test
pytest
```
## 📄 License
This project is licensed under the Apache 2.0 License - see the [LICENSE](LICENSE.txt) file for details.
## 🙏 Acknowledgments
- **FastAPI**: Excellent API framework
- **Qdrant**: High-performance vector database
- **Pydantic**: Data validation and configuration
---
<p align="center">
<strong>Start simple. Scale seamlessly. 🚀</strong>
</p>