# DeepSeek MCP Client
A Python client for connecting DeepSeek language models with Model Context Protocol (MCP) servers, enabling seamless integration between AI models and external tools.
## Features
- **Multiple Transport Support**: HTTP/HTTPS, STDIO, and in-memory connections
- **Flexible Configuration**: Support for multiple MCP servers simultaneously
- **Progress Monitoring**: Real-time tracking of long-running operations
- **Automatic Tool Discovery**: Dynamic detection and usage of available tools
- **Configurable Logging**: Silent by default, with optional detailed logging
- **Error Recovery**: Robust error handling and connection management
- **Modular Architecture**: Clean, maintainable codebase with separated concerns
## Installation
### From PyPI (Recommended)
```bash
pip install deepseek-mcp-client
```
### From Source
```bash
git clone https://github.com/CarlosMaroRuiz/deepseek-mcp-client.git
cd deepseek-mcp-client
pip install -e .
```
## Quick Start
### 1. Set up your API key
```bash
export DEEPSEEK_API_KEY="your_api_key_here"
```
### 2. Basic usage
```python
from deepseek_mcp_client import DeepSeekClient
import asyncio

# Simple client without MCP servers
client = DeepSeekClient(model='deepseek-chat')

async def main():
    result = await client.execute('Hello, how are you?')
    print(result.output)
    await client.close()

asyncio.run(main())
```
## Configuration
### HTTP MCP Servers
```python
from deepseek_mcp_client import DeepSeekClient

client = DeepSeekClient(
    model='deepseek-chat',
    system_prompt='You are a helpful assistant with access to external tools.',
    mcp_servers=[
        'http://localhost:8000/mcp/',  # Simple URL
        {
            'url': 'http://localhost:8001/mcp/',
            'headers': {'Authorization': 'Bearer your-token'},
            'timeout': 30.0
        }
    ]
)
```
### STDIO MCP Servers
```python
client = DeepSeekClient(
    model='deepseek-chat',
    mcp_servers=[
        {
            'command': 'python',
            'args': ['mcp_server.py'],
            'env': {'LOG_LEVEL': 'INFO'},
            'cwd': './servers/',
            'timeout': 60.0
        }
    ]
)
```
### Mixed Configuration
```python
from fastmcp import FastMCP

# In-memory server
calculator = FastMCP("Calculator")

@calculator.tool
def add(a: float, b: float) -> float:
    """Add two numbers"""
    return a + b

# Mixed transport types
client = DeepSeekClient(
    model='deepseek-chat',
    mcp_servers=[
        'http://localhost:8000/mcp/',  # HTTP
        {                              # STDIO
            'command': 'python',
            'args': ['local_server.py']
        },
        calculator                     # In-memory
    ]
)
```
## Logging Control
### Silent Operation (Default)
```python
# Completely silent - no logs
client = DeepSeekClient(model='deepseek-chat')
```
### Enable Logging
```python
client = DeepSeekClient(
    model='deepseek-chat',
    enable_logging=True,   # Show client operations
    enable_progress=True,  # Show progress updates
    log_level="INFO"       # Set logging level
)
```
### Colored Logging
```python
from deepseek_mcp_client import setup_colored_logging

# Enable colored output
setup_colored_logging("INFO")

client = DeepSeekClient(
    model='deepseek-chat',
    enable_logging=True
)
```
## Advanced Usage
### Custom Server Configuration
```python
from deepseek_mcp_client import MCPServerConfig

# Detailed HTTP server configuration
http_config = MCPServerConfig(
    url='http://localhost:8000/mcp/',
    headers={'X-API-Key': 'your-key'},
    timeout=45.0,
    transport_type='http'
)

# Detailed STDIO server configuration
stdio_config = MCPServerConfig(
    command='node',
    args=['server.js', '--port', '3000'],
    env={'NODE_ENV': 'production'},
    cwd='./node-server/',
    timeout=30.0
)

client = DeepSeekClient(
    model='deepseek-chat',
    mcp_servers=[http_config, stdio_config]
)
```
### Error Handling
```python
async def robust_execution():
    client = DeepSeekClient(
        model='deepseek-chat',
        mcp_servers=['http://localhost:8000/mcp/']
    )

    try:
        result = await client.execute('Your query here')

        if result.success:
            print(f"Success: {result.output}")
            print(f"Tools used: {result.tools_used}")
            print(f"Duration: {result.metadata.get('duration'):.2f}s")
        else:
            print(f"Error: {result.error}")
    except Exception as e:
        print(f"Critical error: {e}")
    finally:
        await client.close()
```
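Run the coroutine from a synchronous entry point with `asyncio.run`:

```python
import asyncio

asyncio.run(robust_execution())
```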
### Working with Results
```python
result = await client.execute('Analyze the database')

# Check execution status
if result.success:
    print("Execution completed successfully")

# Access response data
print(f"Response: {result.output}")
print(f"Execution ID: {result.execution_id}")
print(f"Tools used: {result.tools_used}")

# Access metadata
metadata = result.metadata
print(f"Duration: {metadata.get('duration')}s")
print(f"Servers connected: {metadata.get('servers_connected')}")
print(f"Tools available: {metadata.get('tools_available')}")

# Convert to dictionary for serialization
result_dict = result.to_dict()
```
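To persist or log a result, the dictionary form can be dumped to JSON. A minimal sketch, assuming `to_dict()` mirrors the `ClientResult` fields and may therefore contain a `datetime` timestamp that needs a string fallback:

```python
import json

# Fields such as `timestamp` may be datetime objects, so fall back to str()
# for anything the JSON encoder does not handle natively.
print(json.dumps(result.to_dict(), default=str, indent=2))
```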
## Common Use Cases
### Database Analysis
```python
client = DeepSeekClient(
    model='deepseek-chat',
    system_prompt='You are a database analyst with SQL expertise.',
    mcp_servers=['http://localhost:8000/mcp/']
)

result = await client.execute('''
Analyze the user_orders table structure and provide insights on:
1. Table schema and relationships
2. Data distribution patterns
3. Performance optimization suggestions
''')
```
### E-commerce Integration
```python
client = DeepSeekClient(
    model='deepseek-chat',
    system_prompt='You are an e-commerce assistant.',
    mcp_servers=[
        'http://localhost:8000/mcp/',  # Product search MCP
        'http://localhost:8001/mcp/'   # Price comparison MCP
    ]
)

result = await client.execute(
    'Find the best gaming laptops under $1500 and compare prices across platforms'
)
```
### Document Generation
```python
client = DeepSeekClient(
    model='deepseek-chat',
    system_prompt='You are a document generation specialist.',
    mcp_servers=[
        {
            'command': 'python',
            'args': ['latex_server.py'],
            'env': {'DOCUMENT_TEMPLATE': 'professional'}
        }
    ]
)

result = await client.execute('''
Generate a technical report on our Q4 performance with:
- Executive summary
- Performance metrics charts
- Recommendations section
Export as PDF format
''')
```
## API Reference
### DeepSeekClient
```python
DeepSeekClient(
    model: str,                    # DeepSeek model name (required)
    system_prompt: str = None,     # System prompt for the model
    mcp_servers: List = None,      # MCP server configurations
    enable_logging: bool = False,  # Enable client logging
    enable_progress: bool = False, # Enable progress monitoring
    log_level: str = "INFO"        # Logging level
)
```
### ClientResult
```python
@dataclass
class ClientResult:
    output: str               # Model response
    success: bool             # Execution status
    execution_id: str         # Unique execution identifier
    timestamp: datetime       # Execution timestamp
    tools_used: List[str]     # List of tools executed
    metadata: Dict[str, Any]  # Execution metadata
    raw_response: Any = None  # Raw model response
    error: str = None         # Error message if failed
```
### MCPServerConfig
```python
@dataclass
class MCPServerConfig:
    # HTTP configuration
    url: str = None
    headers: Dict[str, str] = None

    # STDIO configuration
    command: str = None
    args: List[str] = None
    env: Dict[str, str] = None
    cwd: str = None

    # General configuration
    transport_type: str = None  # 'http', 'stdio', 'memory'
    timeout: float = 30.0
    keep_alive: bool = True
```
## Environment Variables
```bash
# Required
DEEPSEEK_API_KEY=your_deepseek_api_key

# Optional
MCP_LOG_LEVEL=INFO
MCP_TIMEOUT=30
```
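If you prefer to keep these values in a local `.env` file instead of exporting them in your shell, you can load them before constructing the client. A minimal sketch using `python-dotenv` (assuming the client reads `DEEPSEEK_API_KEY` from the process environment, as the export step above implies):

```python
import os

from dotenv import load_dotenv  # pip install python-dotenv
from deepseek_mcp_client import DeepSeekClient

# Load DEEPSEEK_API_KEY (and any optional MCP_* settings) from ./.env
load_dotenv()

if not os.getenv("DEEPSEEK_API_KEY"):
    raise RuntimeError("DEEPSEEK_API_KEY is not set")

client = DeepSeekClient(model='deepseek-chat')
```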
## Compatible MCP Servers
### Production Servers
- **LaTeX MCP**: Document generation and PDF creation
- **Database MCP**: SQL database operations and analysis
- **Weather MCP**: Weather data and forecasting
- **Web Search MCP**: Internet search and content retrieval
### Development Servers
- **FastMCP**: In-memory servers for testing and development
- **Local File MCP**: File system operations
- **Calculator MCP**: Mathematical operations
## MCP Servers That Might Interest You
- **[MCP SQL](https://github.com/CarlosMaroRuiz/mcp_sql)**: Advanced SQL database operations with query learning, schema analysis, and performance optimization features