# LocalGenius
A personal MCP (Model Context Protocol) server that acts as a RAG (Retrieval-Augmented Generation) source for your LLM. Works with Claude Desktop!
## Features
- 🔍 Semantic search through your local documents
- 🤖 RAG-powered Q&A using OpenAI GPT-4o
- 📁 Local vector store using SQLite + FAISS
- 🖥️ Claude Desktop integration via MCP
- 🎨 Rich CLI interface with onboarding wizard
- ⚡ Fast indexing and retrieval
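Conceptually, the vector-store retrieval above works by embedding documents and queries as vectors and ranking documents by cosine similarity. The toy sketch below illustrates that general pattern with hand-made 3-dimensional vectors; it is not LocalGenius's internal code (which uses OpenAI embeddings with SQLite + FAISS):

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Pretend these are embeddings of three indexed documents
docs = {
    "invoices":      [0.9, 0.1, 0.0],
    "meeting notes": [0.1, 0.8, 0.1],
    "recipes":       [0.0, 0.2, 0.9],
}
query = [0.1, 0.9, 0.0]  # an embedded search query

# Retrieval = pick the document whose vector is closest to the query
best = max(docs, key=lambda name: cosine(docs[name], query))
print(best)  # → meeting notes
```

A real index replaces the `max` loop with an approximate nearest-neighbour structure (like FAISS) so lookups stay fast at scale.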
## Installation
```bash
# Install from PyPI (recommended)
pip install localgenius
# Or install from source
git clone https://github.com/marcokotrotsos/localgenius.git
cd localgenius
pip install -e .
```
## Quick Start
```bash
# 1. Set your OpenAI API key
export OPENAI_API_KEY="sk-..."
# 2. First run - interactive setup wizard
localgenius init
# 3. Add documents to index
localgenius add-source /path/to/documents --name "My Docs"
# 4. Index the documents
localgenius index
# 5. Test it out
localgenius ask "What are the main topics in my documents?"
```
## Claude Desktop Integration
### Automatic Setup (Recommended)
```bash
# Automatically configure Claude Desktop
localgenius install --claude
```
This command will:
- Create the MCP server launcher script
- Configure Claude Desktop automatically
- Back up existing configuration
- Show you next steps
Then restart Claude Desktop (Cmd+Q) and look for the MCP icon (🧩)!
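For reference, MCP servers are registered in Claude Desktop's `claude_desktop_config.json` under an `mcpServers` key. The installer generates the correct entry for your system; a hand-written one might look roughly like this (the exact `command` value and arguments here are assumptions, so prefer `localgenius install --claude`):

```json
{
  "mcpServers": {
    "localgenius": {
      "command": "localgenius",
      "args": ["serve", "--mcp"]
    }
  }
}
```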
## CLI Commands
```bash
# Initialize (first-time setup)
localgenius init
# Manage data sources
localgenius add-source /path/to/docs --name "My Docs" --index
localgenius remove-source /path/to/docs
localgenius list-sources
# Install integrations
localgenius install --claude # Auto-configure Claude Desktop
# Index documents
localgenius index # Index all sources
localgenius index --source /path # Index specific source
localgenius index --force # Force re-index
localgenius index --show # Show detailed index statistics
# Search and ask questions
localgenius search "your query"
localgenius ask "your question"
localgenius ask "question" --model gpt-4o --stream
# Run servers
localgenius serve # MCP server for Claude Desktop (default)
localgenius serve --admin # Web admin interface on http://localhost:3000
localgenius serve --mcp # MCP server explicitly
# Check status
localgenius status
```
## Usage in Claude Desktop
Once configured, you can ask Claude to use your documents:
- "Search my documents for information about X"
- "What do my files say about Y?"
- "Show me the LocalGenius status"
- "Based on my indexed documents, explain Z"
## Web Admin Interface
LocalGenius includes a modern React-based admin interface for managing your RAG system:
```bash
# Start the admin interface
localgenius serve --admin
# Opens:
# - Admin interface: http://localhost:3000
# - API backend: http://localhost:8765 (proxied through Next.js)
```
### Admin Features:
- 📊 **Dashboard** - View system status and statistics
- 📁 **Data Sources** - Add, remove, and manage document sources
- 🔍 **Search & Test** - Test semantic search and RAG queries
- ⚙️ **Settings** - Configure embedding, chunking, and MCP settings
- 📈 **Index Management** - View detailed index statistics and trigger re-indexing
## Available MCP Tools
LocalGenius provides three tools to Claude Desktop:
1. **search** - Semantic search through documents
- Find relevant content based on similarity
- Adjustable similarity threshold and result count
2. **ask** - RAG-powered Q&A using GPT-4o
- Get AI-generated answers based on your documents
- Includes source citations
3. **status** - Get index statistics
- View total documents, sources, and file types
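Under the hood, Claude Desktop invokes these tools over MCP's JSON-RPC `tools/call` method. A request to the **search** tool might look like the following (the argument names `query` and `top_k` are illustrative assumptions, not the confirmed tool schema):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "search",
    "arguments": {"query": "quarterly revenue", "top_k": 5}
  }
}
```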
## Requirements
- Python 3.8+
- OpenAI API key (for embeddings and RAG)
## Troubleshooting
### "Command not found: localgenius"
```bash
# Reinstall, then make sure pip's script directory is on your PATH
pip install localgenius   # PyPI install
pip install -e .          # or, from a source checkout
```
### "OpenAI API key not configured"
```bash
export OPENAI_API_KEY="sk-..."
# Make it permanent by adding the line to your shell profile:
echo 'export OPENAI_API_KEY="sk-..."' >> ~/.zshrc   # or ~/.bashrc
```
### Claude Desktop doesn't show LocalGenius
1. Run `localgenius install --claude` to auto-configure
2. Make sure to completely restart Claude Desktop (Cmd+Q)
3. Check Console.app for error messages
### "No documents indexed"
```bash
localgenius status # Check what's configured
localgenius index # Run indexing
```