# Qdrant MCP Server
A Model Context Protocol (MCP) server that provides semantic memory capabilities using the Qdrant vector database with configurable embedding providers.
## Features
- **Multiple Embedding Providers**:
- OpenAI (text-embedding-3-small, text-embedding-3-large, text-embedding-ada-002)
- Sentence Transformers (all-MiniLM-L6-v2, all-mpnet-base-v2, and more)
- **Semantic Search**: Store and retrieve information using vector similarity
- **Flexible Configuration**: Environment variables for all settings
- **MCP Tools**: Store, find, delete, and list operations
- **Metadata Support**: Attach custom metadata to stored content
## Installation
### Via uvx (Recommended for MCP)
The server is lightweight by default: the OpenAI provider pulls in no local ML dependencies. To run it with OpenAI embeddings:
```bash
# For OpenAI embeddings (lightweight, no ML dependencies)
uvx qdrant-mcp
```
For local embeddings with Sentence Transformers:
```bash
# For local embeddings (includes torch and other ML libraries)
uvx --with sentence-transformers qdrant-mcp
```
### Via pip (Development)
```bash
# Clone the repository
git clone https://github.com/andrewlwn77/qdrant-mcp.git
cd qdrant-mcp
# Basic install (OpenAI embeddings only)
pip install -e .
# With local embeddings support
pip install -e . sentence-transformers
```
## Configuration
The server can be configured using environment variables:
### Required Environment Variables
- `EMBEDDING_PROVIDER`: Choose between `openai` or `sentence-transformers`
- `EMBEDDING_MODEL`: Model name for the chosen provider
- `OPENAI_API_KEY`: Required when using OpenAI embeddings
### Optional Environment Variables
- `QDRANT_URL`: Qdrant server URL (default: `http://localhost:6333`)
- `QDRANT_API_KEY`: Qdrant API key (optional)
- `COLLECTION_NAME`: Qdrant collection name (default: `mcp_memory`)
- `DEVICE`: Device for sentence transformers (default: auto-detect)
- `DEFAULT_LIMIT`: Default search results limit (default: 10)
- `SCORE_THRESHOLD`: Minimum similarity score (default: 0.0)
### Example Configuration
```bash
# OpenAI embeddings
export EMBEDDING_PROVIDER=openai
export EMBEDDING_MODEL=text-embedding-3-small
export OPENAI_API_KEY=your-api-key
# Sentence Transformers (local)
export EMBEDDING_PROVIDER=sentence-transformers
export EMBEDDING_MODEL=all-MiniLM-L6-v2
```
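As a rough sketch of how these variables fit together (names and defaults are taken from the tables above, but the helper itself is hypothetical — it is not the server's actual settings module):

```python
import os


def load_settings() -> dict:
    """Read the documented environment variables, applying the stated defaults.

    Illustrative only; the package's own settings.py may differ.
    """
    provider = os.environ["EMBEDDING_PROVIDER"]  # "openai" or "sentence-transformers"
    settings = {
        "provider": provider,
        "model": os.environ["EMBEDDING_MODEL"],
        "qdrant_url": os.environ.get("QDRANT_URL", "http://localhost:6333"),
        "qdrant_api_key": os.environ.get("QDRANT_API_KEY"),
        "collection": os.environ.get("COLLECTION_NAME", "mcp_memory"),
        "default_limit": int(os.environ.get("DEFAULT_LIMIT", "10")),
        "score_threshold": float(os.environ.get("SCORE_THRESHOLD", "0.0")),
    }
    if provider == "openai":
        # OPENAI_API_KEY is required only for the OpenAI provider
        settings["openai_api_key"] = os.environ["OPENAI_API_KEY"]
    return settings
```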
## Supported Embedding Models
### OpenAI Models
- `text-embedding-3-small` (1536 dimensions) - Default
- `text-embedding-3-large` (3072 dimensions)
- `text-embedding-ada-002` (1536 dimensions) - Legacy
### Sentence Transformers Models
- `all-MiniLM-L6-v2` (384 dimensions) - Fast and efficient
- `all-mpnet-base-v2` (768 dimensions) - Higher quality
- Any other Sentence Transformers model from Hugging Face
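The per-model dimensions listed above matter because the Qdrant collection must be created with a matching vector size. A minimal sketch of resolving that size from the provider/model pair (the dimension table comes from this section; the function and class names are hypothetical, not the package's `embeddings/factory.py` API):

```python
from dataclasses import dataclass

# Dimensions for the models documented above.
MODEL_DIMENSIONS = {
    "text-embedding-3-small": 1536,
    "text-embedding-3-large": 3072,
    "text-embedding-ada-002": 1536,
    "all-MiniLM-L6-v2": 384,
    "all-mpnet-base-v2": 768,
}


@dataclass
class EmbeddingSpec:
    provider: str
    model: str
    dimensions: int


def make_embedding_spec(provider: str, model: str) -> EmbeddingSpec:
    """Resolve the vector size the Qdrant collection will need."""
    if provider not in ("openai", "sentence-transformers"):
        raise ValueError(f"unknown provider: {provider}")
    try:
        dims = MODEL_DIMENSIONS[model]
    except KeyError:
        raise ValueError(f"unknown model: {model}") from None
    return EmbeddingSpec(provider, model, dims)
```

For other Sentence Transformers models from Hugging Face, the dimension would instead come from the loaded model itself rather than a static table.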
## Usage
### Starting the Server
```bash
# Development mode
python -m qdrant_mcp.server
# With MCP CLI
mcp dev src/qdrant_mcp/server.py
```
### MCP Tools
#### qdrant-store
Store content with semantic embeddings:
```json
{
  "content": "The capital of France is Paris",
  "metadata": "{\"category\": \"geography\", \"type\": \"fact\"}",
  "id": "optional-custom-id"
}
```
#### qdrant-find
Search for relevant information:
```json
{
  "query": "What is the capital of France?",
  "limit": 5,
  "filter": "{\"category\": \"geography\"}",
  "score_threshold": 0.7
}
```
#### qdrant-delete
Delete stored items:
```json
{
  "ids": "id1,id2,id3"
}
```
#### qdrant-list-collections
List all collections in Qdrant:
```json
{}
```
#### qdrant-collection-info
Get information about the current collection:
```json
{}
```
## Integration with Claude Desktop
Add to your Claude Desktop configuration:
### For OpenAI Embeddings (Lightweight)
```json
{
  "mcpServers": {
    "qdrant-memory": {
      "command": "uvx",
      "args": ["qdrant-mcp"],
      "env": {
        "EMBEDDING_PROVIDER": "openai",
        "EMBEDDING_MODEL": "text-embedding-3-small",
        "OPENAI_API_KEY": "your-api-key",
        "QDRANT_URL": "https://your-instance.qdrant.io",
        "QDRANT_API_KEY": "your-qdrant-api-key"
      }
    }
  }
}
```
### For Local Embeddings (Sentence Transformers)
```json
{
  "mcpServers": {
    "qdrant-memory": {
      "command": "uvx",
      "args": ["--with", "sentence-transformers", "qdrant-mcp"],
      "env": {
        "EMBEDDING_PROVIDER": "sentence-transformers",
        "EMBEDDING_MODEL": "all-MiniLM-L6-v2",
        "QDRANT_URL": "https://your-instance.qdrant.io",
        "QDRANT_API_KEY": "your-qdrant-api-key"
      }
    }
  }
}
```
## Development
### Running Tests
```bash
# Install dev dependencies
pip install -e ".[dev]"
# Run tests
pytest
# Type checking
mypy src/
# Linting
ruff check src/
```
### Project Structure
```
qdrant-mcp/
├── src/
│   └── qdrant_mcp/
│       ├── __init__.py
│       ├── server.py                 # MCP server implementation
│       ├── settings.py               # Configuration management
│       ├── qdrant_client.py          # Qdrant operations
│       └── embeddings/
│           ├── base.py               # Abstract base class
│           ├── factory.py            # Provider factory
│           ├── openai.py             # OpenAI implementation
│           └── sentence_transformers.py  # ST implementation
└── tests/
    └── test_server.py
```
## Docker Support
```dockerfile
FROM python:3.10-slim
WORKDIR /app
COPY . .
RUN pip install -e .
CMD ["python", "-m", "qdrant_mcp.server"]
```
## License
Apache License 2.0
## Contributing
Contributions are welcome! Please feel free to submit a Pull Request.