metorial-deepseek


Name: metorial-deepseek
Version: 1.0.0rc2
Summary: DeepSeek provider for Metorial
Upload time: 2025-07-26 12:17:46
Requires Python: >=3.9
License: MIT
Keywords: ai, deepseek, llm, metorial
# metorial-deepseek

DeepSeek provider integration for Metorial. It lets you use Metorial tools with DeepSeek's language models through OpenAI-compatible function calling.

## Installation

```bash
pip install metorial-deepseek
# or
uv add metorial-deepseek
# or
poetry add metorial-deepseek
```

## Features

- 🤖 **DeepSeek Integration**: Full support for DeepSeek Chat, DeepSeek Coder, and other models
- 🛠️ **Function Calling**: OpenAI-compatible function calling support
- 📡 **Session Management**: Automatic tool lifecycle handling
- 🔄 **Format Conversion**: Converts Metorial tools to OpenAI function format
- ⚡ **Async Support**: Full async/await support

## Usage

### Basic Usage

```python
import asyncio
from openai import OpenAI
from metorial import Metorial
from metorial_deepseek import MetorialDeepSeekSession

async def main():
    # Initialize clients
    metorial = Metorial(api_key="your-metorial-api-key")
    
    # DeepSeek uses OpenAI-compatible client
    deepseek_client = OpenAI(
        api_key="your-deepseek-api-key",
        base_url="https://api.deepseek.com"
    )
    
    # Create session with your server deployments
    async with metorial.session(["your-server-deployment-id"]) as session:
        # Create DeepSeek-specific wrapper
        deepseek_session = MetorialDeepSeekSession(session.tool_manager)
        
        messages = [
            {"role": "user", "content": "Help me analyze this code"}
        ]
        
        response = deepseek_client.chat.completions.create(
            model="deepseek-chat",
            messages=messages,
            tools=deepseek_session.tools
        )
        
        # Handle tool calls
        tool_calls = response.choices[0].message.tool_calls
        if tool_calls:
            tool_responses = await deepseek_session.call_tools(tool_calls)
            
            # Add to conversation
            messages.append({
                "role": "assistant",
                "tool_calls": tool_calls
            })
            messages.extend(tool_responses)
            
            # Continue conversation...

asyncio.run(main())
```
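
The "Continue conversation" step above usually means sending the updated messages, now including the tool results, back to DeepSeek for a final answer. A minimal sketch of that follow-up call, reusing `deepseek_client`, `messages`, and `deepseek_session` from the example:

```python
# Send the tool results back to DeepSeek so it can produce a final reply.
follow_up = deepseek_client.chat.completions.create(
    model="deepseek-chat",
    messages=messages,
    tools=deepseek_session.tools,
)
print(follow_up.choices[0].message.content)
```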

### Using Convenience Functions

```python
from metorial_deepseek import build_deepseek_tools, call_deepseek_tools

async def example_with_functions(tool_manager, tool_calls):
    # tool_manager: the session's tool manager (e.g. session.tool_manager)
    # tool_calls: tool calls returned in a DeepSeek chat completion response

    # Get tool definitions in the OpenAI-compatible format DeepSeek expects
    tools = build_deepseek_tools(tool_manager)

    # Execute the tool calls and collect the resulting tool messages
    tool_messages = await call_deepseek_tools(tool_manager, tool_calls)
    return tools, tool_messages
```

## API Reference

### `MetorialDeepSeekSession`

Main session class for DeepSeek integration.

```python
session = MetorialDeepSeekSession(tool_manager)
```

**Properties:**
- `tools`: List of tools in OpenAI-compatible format

**Methods:**
- `async call_tools(tool_calls)`: Execute tool calls and return tool messages
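
For example, both members can be used on one wrapper (assuming a `tool_manager` and a DeepSeek `response` as in the usage example above):

```python
deepseek_session = MetorialDeepSeekSession(tool_manager)

# Tool definitions to pass to the chat completion request
tools = deepseek_session.tools

# Execute the tool calls DeepSeek requested and get back tool messages
tool_messages = await deepseek_session.call_tools(
    response.choices[0].message.tool_calls
)
```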

### `build_deepseek_tools(tool_mgr)`

Build DeepSeek-compatible tool definitions.

**Returns:** List of tool definitions in OpenAI format

### `call_deepseek_tools(tool_mgr, tool_calls)`

Execute tool calls from DeepSeek response.

**Returns:** List of tool messages
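
The two helpers can be combined into one request cycle. A sketch, assuming a `tool_manager` from an active Metorial session and a DeepSeek client configured as shown earlier:

```python
from metorial_deepseek import build_deepseek_tools, call_deepseek_tools

# Convert Metorial tools to the OpenAI-compatible format
tools = build_deepseek_tools(tool_manager)

response = deepseek_client.chat.completions.create(
    model="deepseek-chat",
    messages=messages,
    tools=tools,
)

tool_calls = response.choices[0].message.tool_calls
if tool_calls:
    # Run the requested tools and append the results to the conversation
    tool_messages = await call_deepseek_tools(tool_manager, tool_calls)
    messages.append({"role": "assistant", "tool_calls": tool_calls})
    messages.extend(tool_messages)
```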

## Tool Format

Tools are converted to OpenAI-compatible format (without strict mode):

```python
{
    "type": "function",
    "function": {
        "name": "tool_name",
        "description": "Tool description",
        "parameters": {
            "type": "object",
            "properties": {...},
            "required": [...]
        }
    }
}
```
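
For example, a hypothetical `get_weather` tool would be exposed to DeepSeek roughly like this (the name and parameters are made up for illustration):

```python
{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Look up the current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {
                "city": {"type": "string", "description": "City name"}
            },
            "required": ["city"]
        }
    }
}
```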

## DeepSeek API Configuration

DeepSeek uses the OpenAI-compatible API format. Configure your client like this:

```python
from openai import OpenAI

client = OpenAI(
    api_key="your-deepseek-api-key",
    base_url="https://api.deepseek.com"
)
```

## Supported Models

- `deepseek-chat`: General-purpose conversational model
- `deepseek-coder`: Specialized for code-related tasks
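
Both models work with the same client and tools; for example, switching to the coder model for a code-focused prompt:

```python
response = deepseek_client.chat.completions.create(
    model="deepseek-coder",
    messages=[{"role": "user", "content": "Refactor this function to use async/await"}],
    tools=deepseek_session.tools,
)
```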

## Error Handling

```python
try:
    tool_messages = await deepseek_session.call_tools(tool_calls)
except Exception as e:
    print(f"Tool execution failed: {e}")
```

Tool errors are returned as tool messages with error content.
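
Because failed calls come back as ordinary tool messages, they can also be appended to the conversation and surfaced to the model like any other result. A minimal sketch, assuming dict-shaped messages whose error text lives in the `content` field:

```python
tool_messages = await deepseek_session.call_tools(tool_calls)

# Forward results (including any error content) back to the model
messages.extend(tool_messages)

for msg in tool_messages:
    print(msg.get("content"))  # assumed dict with a "content" field
```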

## Dependencies

- `metorial-openai-compatible>=1.0.0`
- `metorial-mcp-session>=1.0.0`
- `typing-extensions>=4.0.0`

## License

MIT License - see [LICENSE](../../LICENSE) file for details.

            
