mcp

- Name: mcp
- Version: 1.1.0
- Summary: Model Context Protocol SDK
- Author: Anthropic, PBC.
- Requires Python: >=3.10
- License: MIT
- Keywords: automation, git, llm, mcp
- Upload time: 2024-12-03 22:39:19

# MCP Python SDK
[![PyPI][pypi-badge]][pypi-url]
[![MIT licensed][mit-badge]][mit-url]
[![Python Version][python-badge]][python-url]
[![Documentation][docs-badge]][docs-url]
[![Specification][spec-badge]][spec-url]
[![GitHub Discussions][discussions-badge]][discussions-url]

[pypi-badge]: https://img.shields.io/pypi/v/mcp.svg
[pypi-url]: https://pypi.org/project/mcp/
[mit-badge]: https://img.shields.io/pypi/l/mcp.svg
[mit-url]: https://github.com/modelcontextprotocol/python-sdk/blob/main/LICENSE
[python-badge]: https://img.shields.io/pypi/pyversions/mcp.svg
[python-url]: https://www.python.org/downloads/
[docs-badge]: https://img.shields.io/badge/docs-modelcontextprotocol.io-blue.svg
[docs-url]: https://modelcontextprotocol.io
[spec-badge]: https://img.shields.io/badge/spec-spec.modelcontextprotocol.io-blue.svg
[spec-url]: https://spec.modelcontextprotocol.io
[discussions-badge]: https://img.shields.io/github/discussions/modelcontextprotocol/python-sdk
[discussions-url]: https://github.com/modelcontextprotocol/python-sdk/discussions

Python implementation of the [Model Context Protocol](https://modelcontextprotocol.io) (MCP), providing both client and server capabilities for integrating with LLM surfaces.

## Overview

The Model Context Protocol allows applications to provide context for LLMs in a standardized way, separating the concerns of providing context from the actual LLM interaction. This Python SDK implements the full MCP specification, making it easy to:

- Build MCP clients that can connect to any MCP server
- Create MCP servers that expose resources, prompts and tools
- Use standard transports like stdio and SSE
- Handle all MCP protocol messages and lifecycle events

## Installation

We recommend the use of [uv](https://docs.astral.sh/uv/) to manage your Python projects:

```bash
uv add mcp
```

Alternatively, install with pip, or add `mcp` to your `requirements.txt` and install from there:
```bash
pip install mcp
# or, after adding mcp to requirements.txt:
pip install -r requirements.txt
```

## Core Concepts
MCP servers provide focused functionality like resources, tools, prompts, and other capabilities that can be reused across many client applications. These servers are designed to be easy to build, highly composable, and modular.

### Key design principles
- Servers are extremely easy to build with clear, simple interfaces
- Multiple servers can be composed seamlessly through a shared protocol
- Each server operates in isolation and cannot access conversation context
- Features can be added progressively through capability negotiation

### Server-provided primitives
- [Prompts](https://modelcontextprotocol.io/docs/concepts/prompts): Templatable text
- [Resources](https://modelcontextprotocol.io/docs/concepts/resources): File-like attachments
- [Tools](https://modelcontextprotocol.io/docs/concepts/tools): Functions that models can call
- Utilities:
  - Completion: Auto-completion provider for prompt arguments or resource URI templates
  - Logging: Logging to the client
  - Pagination: Pagination for long results

### Client-provided primitives
- [Sampling](https://modelcontextprotocol.io/docs/concepts/sampling): Allow servers to sample using client models
- Roots: Information about locations to operate on (e.g., directories)

Connections between clients and servers are established through transports such as **stdio** or **SSE** (note that most clients currently support stdio but not SSE). The transport layer handles message framing, delivery, and error handling.
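
For example, connecting a client over SSE instead of stdio is a small change from the stdio flow shown in the Quick Start below. This is a minimal sketch assuming the SDK's SSE client helper in `mcp.client.sse`; the URL is a placeholder for wherever your SSE server is listening:

```python
import asyncio

from mcp import ClientSession
from mcp.client.sse import sse_client

async def main():
    # Placeholder URL: point this at an MCP server exposing an SSE endpoint
    async with sse_client("http://localhost:8000/sse") as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print(tools)

if __name__ == "__main__":
    asyncio.run(main())
```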

## Quick Start

### Creating a Server

MCP servers follow a decorator approach to register handlers for MCP primitives like resources, prompts, and tools. The goal is to provide a simple interface for exposing capabilities to LLM clients.

**example_server.py**

```python
# /// script
# dependencies = [
#   "mcp"
# ]
# ///
from mcp.server import Server, NotificationOptions
from mcp.server.models import InitializationOptions
import mcp.server.stdio
import mcp.types as types

# Create a server instance
server = Server("example-server")

# Add prompt capabilities
@server.list_prompts()
async def handle_list_prompts() -> list[types.Prompt]:
    return [
        types.Prompt(
            name="example-prompt",
            description="An example prompt template",
            arguments=[
                types.PromptArgument(
                    name="arg1",
                    description="Example argument",
                    required=True
                )
            ]
        )
    ]

@server.get_prompt()
async def handle_get_prompt(
    name: str,
    arguments: dict[str, str] | None
) -> types.GetPromptResult:
    if name != "example-prompt":
        raise ValueError(f"Unknown prompt: {name}")

    return types.GetPromptResult(
        description="Example prompt",
        messages=[
            types.PromptMessage(
                role="user",
                content=types.TextContent(
                    type="text",
                    text="Example prompt text"
                )
            )
        ]
    )

async def run():
    # Run the server as STDIO
    async with mcp.server.stdio.stdio_server() as (read_stream, write_stream):
        await server.run(
            read_stream,
            write_stream,
            InitializationOptions(
                server_name="example",
                server_version="0.1.0",
                capabilities=server.get_capabilities(
                    notification_options=NotificationOptions(),
                    experimental_capabilities={},
                )
            )
        )

if __name__ == "__main__":
    import asyncio
    asyncio.run(run())
```

### Creating a Client

**example_client.py**

```python
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Create server parameters for stdio connection
server_params = StdioServerParameters(
    command="python", # Executable
    args=["example_server.py"], # Optional command line arguments
    env=None # Optional environment variables
)

async def run():
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            # Initialize the connection
            await session.initialize()

            # The example server only supports prompt primitives:
        
            # List available prompts
            prompts = await session.list_prompts()

            # Get a prompt
            prompt = await session.get_prompt("example-prompt", arguments={"arg1": "value"})

            """
            Other example calls include:

            # List available resources
            resources = await session.list_resources()

            # List available tools
            tools = await session.list_tools()

            # Read a resource
            resource = await session.read_resource("file://some/path")

            # Call a tool
            result = await session.call_tool("tool-name", arguments={"arg1": "value"})
            """

if __name__ == "__main__":
    import asyncio
    asyncio.run(run())
```

## Primitives

The MCP Python SDK provides decorators that map to the core protocol primitives. Each primitive follows a different interaction pattern based on how it is controlled and used:

| Primitive | Control               | Description                                         | Example Use                  |
|-----------|-----------------------|-----------------------------------------------------|------------------------------|
| Prompts   | User-controlled       | Interactive templates invoked by user choice        | Slash commands, menu options |
| Resources | Application-controlled| Contextual data managed by the client application   | File contents, API responses |
| Tools     | Model-controlled      | Functions exposed to the LLM to take actions        | API calls, data updates      |

### User-Controlled Primitives

**Prompts** are designed to be explicitly selected by users for their interactions with LLMs.

| Decorator                | Description                            |
|--------------------------|----------------------------------------|
| `@server.list_prompts()` | List available prompt templates        |
| `@server.get_prompt()`   | Get a specific prompt with arguments   |

### Application-Controlled Primitives

**Resources** are controlled by the client application, which decides how and when they should be used based on its own logic.

| Decorator                      | Description                           |
|--------------------------------|---------------------------------------|
| `@server.list_resources()`     | List available resources              |
| `@server.read_resource()`      | Read a specific resource's content    |
| `@server.subscribe_resource()` | Subscribe to resource updates         |
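
As a rough sketch, these decorators can be used like this, building on the `server` instance and `types` import from the Quick Start (the URI and contents here are illustrative, not part of the SDK):

```python
from pydantic import AnyUrl

@server.list_resources()
async def handle_list_resources() -> list[types.Resource]:
    return [
        types.Resource(
            uri=AnyUrl("file:///greeting.txt"),
            name="Greeting",
            description="A sample text resource",
            mimeType="text/plain",
        )
    ]

@server.read_resource()
async def handle_read_resource(uri: AnyUrl) -> str:
    # Return the contents of the requested resource
    if str(uri) == "file:///greeting.txt":
        return "Hello from the example server!"
    raise ValueError(f"Unknown resource: {uri}")
```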

### Model-Controlled Primitives

**Tools** are exposed to LLMs to enable automated actions, with user approval.

| Decorator              | Description                        |
|------------------------|------------------------------------|
| `@server.list_tools()` | List available tools               |
| `@server.call_tool()`  | Execute a tool with arguments      |
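
A minimal sketch of a tool, again building on the `server` instance from the Quick Start (the tool name and JSON schema are illustrative):

```python
@server.list_tools()
async def handle_list_tools() -> list[types.Tool]:
    return [
        types.Tool(
            name="add",
            description="Add two numbers",
            inputSchema={
                "type": "object",
                "properties": {
                    "a": {"type": "number"},
                    "b": {"type": "number"},
                },
                "required": ["a", "b"],
            },
        )
    ]

@server.call_tool()
async def handle_call_tool(name: str, arguments: dict | None) -> list[types.TextContent]:
    if name != "add":
        raise ValueError(f"Unknown tool: {name}")
    args = arguments or {}
    # Return tool output as text content
    return [types.TextContent(type="text", text=str(args["a"] + args["b"]))]
```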

### Server Management

Additional decorators for server functionality:

| Decorator                     | Description                    |
|-------------------------------|--------------------------------|
| `@server.set_logging_level()` | Update server logging level    |
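
A hedged sketch of the corresponding handler; the mapping to Python's standard `logging` levels below is an illustration, not something the SDK provides:

```python
import logging

# Illustrative mapping from MCP logging levels to Python logging levels
LEVEL_MAP = {
    "debug": logging.DEBUG,
    "info": logging.INFO,
    "notice": logging.INFO,
    "warning": logging.WARNING,
    "error": logging.ERROR,
    "critical": logging.CRITICAL,
    "alert": logging.CRITICAL,
    "emergency": logging.CRITICAL,
}

@server.set_logging_level()
async def handle_set_logging_level(level: types.LoggingLevel) -> None:
    # Adjust this process's log level to match the client's request
    logging.getLogger().setLevel(LEVEL_MAP.get(level, logging.INFO))
```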

### Capabilities

MCP servers declare capabilities during initialization. These map to specific decorators:

| Capability  | Feature Flag                 | Decorators                                                      | Description                        |
|-------------|------------------------------|-----------------------------------------------------------------|-------------------------------------|
| `prompts`   | `listChanged`                | `@list_prompts`<br/>`@get_prompt`                               | Prompt template management          |
| `resources` | `subscribe`<br/>`listChanged`| `@list_resources`<br/>`@read_resource`<br/>`@subscribe_resource`| Resource exposure and updates       |
| `tools`     | `listChanged`                | `@list_tools`<br/>`@call_tool`                                  | Tool discovery and execution        |
| `logging`   | -                            | `@set_logging_level`                                            | Server logging configuration        |
| `completion`| -                            | `@complete_argument`                                            | Argument completion suggestions     |

Capabilities are negotiated during connection initialization. Servers only need to implement the decorators for capabilities they support.
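
On the client side, the negotiated capabilities can be inspected from the result of `initialize()`. A minimal sketch, assuming the example server from the Quick Start:

```python
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def check_capabilities():
    server_params = StdioServerParameters(command="python", args=["example_server.py"])
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            init_result = await session.initialize()

            # Only call the APIs the server actually declared
            caps = init_result.capabilities
            if caps.prompts:
                print("prompts:", await session.list_prompts())
            if caps.tools:
                print("tools:", await session.list_tools())
            if caps.resources:
                print("resources:", await session.list_resources())
```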

## Client Interaction

The MCP Python SDK enables servers to interact with clients through request context and session management. This allows servers to perform operations like LLM sampling and progress tracking.

### Request Context

The Request Context provides access to the current request and client session. It can be accessed through `server.request_context` and enables:

- Sampling from the client's LLM
- Sending progress updates
- Logging messages
- Accessing request metadata

Example using request context for LLM sampling:

```python
import json

@server.call_tool()
async def handle_call_tool(name: str, arguments: dict) -> list[types.TextContent]:
    # Access the current request context
    context = server.request_context

    # Use the session to sample from the client's LLM
    result = await context.session.create_message(
        messages=[
            types.SamplingMessage(
                role="user",
                content=types.TextContent(
                    type="text",
                    text="Analyze this data: " + json.dumps(arguments)
                )
            )
        ],
        max_tokens=100
    )

    return [types.TextContent(type="text", text=result.content.text)]
```

Using request context for progress updates:

```python
@server.call_tool()
async def handle_call_tool(name: str, arguments: dict) -> list[types.TextContent]:
    context = server.request_context

    # context.meta may be None if the client did not include request metadata
    progress_token = context.meta.progressToken if context.meta else None

    if progress_token:
        # Send an intermediate progress notification
        await context.session.send_progress_notification(
            progress_token=progress_token,
            progress=0.5,
            total=1.0
        )

    # Perform operation...

    if progress_token:
        await context.session.send_progress_notification(
            progress_token=progress_token,
            progress=1.0,
            total=1.0
        )

    return [types.TextContent(type="text", text="Operation complete")]
```

The request context is automatically set for each request and provides a safe way to access the current client session and request metadata.
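
The request context can also be used for the logging utility mentioned above, sending log messages back to the client (a minimal sketch; the logger name and message are illustrative):

```python
@server.call_tool()
async def handle_call_tool(name: str, arguments: dict) -> list[types.TextContent]:
    context = server.request_context

    # Send a log notification to the client
    await context.session.send_log_message(
        level="info",
        data=f"Running tool {name}",
        logger="example-server",
    )

    return [types.TextContent(type="text", text="Operation complete")]
```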

## Documentation

- [Model Context Protocol documentation](https://modelcontextprotocol.io)
- [Model Context Protocol specification](https://spec.modelcontextprotocol.io)
- [Officially supported servers](https://github.com/modelcontextprotocol/servers)

## Contributing

We are passionate about supporting contributors of all levels of experience and would love to see you get involved in the project. See the [contributing guide](CONTRIBUTING.md) to get started.

## License

This project is licensed under the MIT License - see the LICENSE file for details.

            

Raw data

            {
    "_id": null,
    "home_page": null,
    "name": "mcp",
    "maintainer": null,
    "docs_url": null,
    "requires_python": ">=3.10",
    "maintainer_email": "David Soria Parra <davidsp@anthropic.com>, Justin Spahr-Summers <justin@anthropic.com>",
    "keywords": "automation, git, llm, mcp",
    "author": "Anthropic, PBC.",
    "author_email": null,
    "download_url": "https://files.pythonhosted.org/packages/77/f2/067b1fc114e8d3ae4af02fc4f4ed8971a2c4900362d976fabe0f4e9a3418/mcp-1.1.0.tar.gz",
    "platform": null,
    "description": "# MCP Python SDK\n[![PyPI][pypi-badge]][pypi-url]\n[![MIT licensed][mit-badge]][mit-url]\n[![Python Version][python-badge]][python-url]\n[![Documentation][docs-badge]][docs-url]\n[![Specification][spec-badge]][spec-url]\n[![GitHub Discussions][discussions-badge]][discussions-url]\n\n[pypi-badge]: https://img.shields.io/pypi/v/mcp.svg\n[pypi-url]: https://pypi.org/project/mcp/\n[mit-badge]: https://img.shields.io/pypi/l/mcp.svg\n[mit-url]: https://github.com/modelcontextprotocol/python-sdk/blob/main/LICENSE\n[python-badge]: https://img.shields.io/pypi/pyversions/mcp.svg\n[python-url]: https://www.python.org/downloads/\n[docs-badge]: https://img.shields.io/badge/docs-modelcontextprotocol.io-blue.svg\n[docs-url]: https://modelcontextprotocol.io\n[spec-badge]: https://img.shields.io/badge/spec-spec.modelcontextprotocol.io-blue.svg\n[spec-url]: https://spec.modelcontextprotocol.io\n[discussions-badge]: https://img.shields.io/github/discussions/modelcontextprotocol/python-sdk\n[discussions-url]: https://github.com/modelcontextprotocol/python-sdk/discussions\n\nPython implementation of the [Model Context Protocol](https://modelcontextprotocol.io) (MCP), providing both client and server capabilities for integrating with LLM surfaces.\n\n## Overview\n\nThe Model Context Protocol allows applications to provide context for LLMs in a standardized way, separating the concerns of providing context from the actual LLM interaction. This Python SDK implements the full MCP specification, making it easy to:\n\n- Build MCP clients that can connect to any MCP server\n- Create MCP servers that expose resources, prompts and tools\n- Use standard transports like stdio and SSE\n- Handle all MCP protocol messages and lifecycle events\n\n## Installation\n\nWe recommend the use of [uv](https://docs.astral.sh/uv/) to manage your Python projects:\n\n```bash\nuv add mcp\n```\n\nAlternatively, add mcp to your `requirements.txt`:\n```\npip install mcp\n# or add to requirements.txt\npip install -r requirements.txt\n```\n\n## Overview\nMCP servers provide focused functionality like resources, tools, prompts, and other capabilities that can be reused across many client applications. These servers are designed to be easy to build, highly composable, and modular.\n\n### Key design principles\n- Servers are extremely easy to build with clear, simple interfaces\n- Multiple servers can be composed seamlessly through a shared protocol\n- Each server operates in isolation and cannot access conversation context\n- Features can be added progressively through capability negotiation\n\n### Server provided primitives\n- [Prompts](https://modelcontextprotocol.io/docs/concepts/prompts): Templatable text\n- [Resources](https://modelcontextprotocol.io/docs/concepts/resources): File-like attachments\n- [Tools](https://modelcontextprotocol.io/docs/concepts/tools): Functions that models can call\n- Utilities:\n  - Completion: Auto-completion provider for prompt arguments or resource URI templates\n  - Logging: Logging to the client\n  - Pagination*: Pagination for long results\n\n### Client provided primitives\n - [Sampling](https://modelcontextprotocol.io/docs/concepts/sampling): Allow servers to sample using client models\n - Roots: Information about locations to operate on (e.g., directories)\n\nConnections between clients and servers are established through transports like **stdio** or **SSE** (Note that most clients support stdio, but not SSE at the moment). 
The transport layer handles message framing, delivery, and error handling.\n\n## Quick Start\n\n### Creating a Server\n\nMCP servers follow a decorator approach to register handlers for MCP primitives like resources, prompts, and tools. The goal is to provide a simple interface for exposing capabilities to LLM clients.\n\n**example_server.py**\n\n```python\n# /// script\n# dependencies = [\n#   \"mcp\"\n# ]\n# ///\nfrom mcp.server import Server, NotificationOptions\nfrom mcp.server.models import InitializationOptions\nimport mcp.server.stdio\nimport mcp.types as types\n\n# Create a server instance\nserver = Server(\"example-server\")\n\n# Add prompt capabilities\n@server.list_prompts()\nasync def handle_list_prompts() -> list[types.Prompt]:\n    return [\n        types.Prompt(\n            name=\"example-prompt\",\n            description=\"An example prompt template\",\n            arguments=[\n                types.PromptArgument(\n                    name=\"arg1\",\n                    description=\"Example argument\",\n                    required=True\n                )\n            ]\n        )\n    ]\n\n@server.get_prompt()\nasync def handle_get_prompt(\n    name: str,\n    arguments: dict[str, str] | None\n) -> types.GetPromptResult:\n    if name != \"example-prompt\":\n        raise ValueError(f\"Unknown prompt: {name}\")\n\n    return types.GetPromptResult(\n        description=\"Example prompt\",\n        messages=[\n            types.PromptMessage(\n                role=\"user\",\n                content=types.TextContent(\n                    type=\"text\",\n                    text=\"Example prompt text\"\n                )\n            )\n        ]\n    )\n\nasync def run():\n    # Run the server as STDIO\n    async with mcp.server.stdio.stdio_server() as (read_stream, write_stream):\n        await server.run(\n            read_stream,\n            write_stream,\n            InitializationOptions(\n                server_name=\"example\",\n                server_version=\"0.1.0\",\n                capabilities=server.get_capabilities(\n                    notification_options=NotificationOptions(),\n                    experimental_capabilities={},\n                )\n            )\n        )\n\nif __name__ == \"__main__\":\n    import asyncio\n    asyncio.run(run())\n```\n\n### Creating a Client\n\n**example_client.py**\n\n```python\nfrom mcp import ClientSession, StdioServerParameters\nfrom mcp.client.stdio import stdio_client\n\n# Create server parameters for stdio connection\nserver_params = StdioServerParameters(\n    command=\"python\", # Executable\n    args=[\"example_server.py\"], # Optional command line arguments\n    env=None # Optional environment variables\n)\n\nasync def run():\n    async with stdio_client(server_params) as (read, write):\n        async with ClientSession(read, write) as session:\n            # Initialize the connection\n            await session.initialize()\n\n            # The example server only supports prompt primitives:\n        \n            # List available prompts\n            prompts = await session.list_prompts()\n\n            # Get a prompt\n            prompt = await session.get_prompt(\"example-prompt\", arguments={\"arg1\": \"value\"})\n\n            \"\"\"\n            Other example calls include:\n\n            # List available resources\n            resources = await session.list_resources()\n\n            # List available tools\n            tools = await session.list_tools()\n\n            # Read a resource\n            
resource = await session.read_resource(\"file://some/path\")\n\n            # Call a tool\n            result = await session.call_tool(\"tool-name\", arguments={\"arg1\": \"value\"})\n            \"\"\"\n\nif __name__ == \"__main__\":\n    import asyncio\n    asyncio.run(run())\n```\n\n## Primitives\n\nThe MCP Python SDK provides decorators that map to the core protocol primitives. Each primitive follows a different interaction pattern based on how it is controlled and used:\n\n| Primitive | Control               | Description                                         | Example Use                  |\n|-----------|-----------------------|-----------------------------------------------------|------------------------------|\n| Prompts   | User-controlled       | Interactive templates invoked by user choice        | Slash commands, menu options |\n| Resources | Application-controlled| Contextual data managed by the client application   | File contents, API responses |\n| Tools     | Model-controlled      | Functions exposed to the LLM to take actions        | API calls, data updates      |\n\n### User-Controlled Primitives\n\n**Prompts** are designed to be explicitly selected by users for their interactions with LLMs.\n\n| Decorator                | Description                            |\n|--------------------------|----------------------------------------|\n| `@server.list_prompts()` | List available prompt templates        |\n| `@server.get_prompt()`   | Get a specific prompt with arguments   |\n\n### Application-Controlled Primitives\n\n**Resources** are controlled by the client application, which decides how and when they should be used based on its own logic.\n\n| Decorator                      | Description                           |\n|--------------------------------|---------------------------------------|\n| `@server.list_resources()`     | List available resources              |\n| `@server.read_resource()`      | Read a specific resource's content    |\n| `@server.subscribe_resource()` | Subscribe to resource updates         |\n\n### Model-Controlled Primitives\n\n**Tools** are exposed to LLMs to enable automated actions, with user approval.\n\n| Decorator              | Description                        |\n|------------------------|------------------------------------|\n| `@server.list_tools()` | List available tools               |\n| `@server.call_tool()`  | Execute a tool with arguments      |\n\n### Server Management\n\nAdditional decorators for server functionality:\n\n| Decorator                     | Description                    |\n|-------------------------------|--------------------------------|\n| `@server.set_logging_level()` | Update server logging level    |\n\n### Capabilities\n\nMCP servers declare capabilities during initialization. 
These map to specific decorators:\n\n| Capability  | Feature Flag                 | Decorators                                                      | Description                        |\n|-------------|------------------------------|-----------------------------------------------------------------|-------------------------------------|\n| `prompts`   | `listChanged`                | `@list_prompts`<br/>`@get_prompt`                               | Prompt template management          |\n| `resources` | `subscribe`<br/>`listChanged`| `@list_resources`<br/>`@read_resource`<br/>`@subscribe_resource`| Resource exposure and updates       |\n| `tools`     | `listChanged`                | `@list_tools`<br/>`@call_tool`                                  | Tool discovery and execution        |\n| `logging`   | -                            | `@set_logging_level`                                            | Server logging configuration        |\n| `completion`| -                            | `@complete_argument`                                            | Argument completion suggestions     |\n\nCapabilities are negotiated during connection initialization. Servers only need to implement the decorators for capabilities they support.\n\n## Client Interaction\n\nThe MCP Python SDK enables servers to interact with clients through request context and session management. This allows servers to perform operations like LLM sampling and progress tracking.\n\n### Request Context\n\nThe Request Context provides access to the current request and client session. It can be accessed through `server.request_context` and enables:\n\n- Sampling from the client's LLM\n- Sending progress updates\n- Logging messages\n- Accessing request metadata\n\nExample using request context for LLM sampling:\n\n```python\n@server.call_tool()\nasync def handle_call_tool(name: str, arguments: dict) -> list[types.TextContent]:\n    # Access the current request context\n    context = server.request_context\n\n    # Use the session to sample from the client's LLM\n    result = await context.session.create_message(\n        messages=[\n            types.SamplingMessage(\n                role=\"user\",\n                content=types.TextContent(\n                    type=\"text\",\n                    text=\"Analyze this data: \" + json.dumps(arguments)\n                )\n            )\n        ],\n        max_tokens=100\n    )\n\n    return [types.TextContent(type=\"text\", text=result.content.text)]\n```\n\nUsing request context for progress updates:\n\n```python\n@server.call_tool()\nasync def handle_call_tool(name: str, arguments: dict) -> list[types.TextContent]:\n    context = server.request_context\n\n    if progress_token := context.meta.progressToken:\n        # Send progress notifications\n        await context.session.send_progress_notification(\n            progress_token=progress_token,\n            progress=0.5,\n            total=1.0\n        )\n\n    # Perform operation...\n\n    if progress_token:\n        await context.session.send_progress_notification(\n            progress_token=progress_token,\n            progress=1.0,\n            total=1.0\n        )\n\n    return [types.TextContent(type=\"text\", text=\"Operation complete\")]\n```\n\nThe request context is automatically set for each request and provides a safe way to access the current client session and request metadata.\n\n## Documentation\n\n- [Model Context Protocol documentation](https://modelcontextprotocol.io)\n- [Model Context Protocol 
specification](https://spec.modelcontextprotocol.io)\n- [Officially supported servers](https://github.com/modelcontextprotocol/servers)\n\n## Contributing\n\nWe are passionate about supporting contributors of all levels of experience and would love to see you get involved in the project. See the [contributing guide](CONTRIBUTING.md) to get started.\n\n## License\n\nThis project is licensed under the MIT License - see the LICENSE file for details.\n",
    "bugtrack_url": null,
    "license": "MIT",
    "summary": "Model Context Protocol SDK",
    "version": "1.1.0",
    "project_urls": {
        "Homepage": "https://modelcontextprotocol.io",
        "Issues": "https://github.com/modelcontextprotocol/python-sdk/issues",
        "Repository": "https://github.com/modelcontextprotocol/python-sdk"
    },
    "split_keywords": [
        "automation",
        " git",
        " llm",
        " mcp"
    ],
    "urls": [
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "b93eaef19ac08a6f9a347c086c4e628c2f7329659828cbe92ffd524ec2aac833",
                "md5": "6e45e0e0a38b36892a291fc7418bdde6",
                "sha256": "44aa4d2e541f0924d6c344aa7f96b427a6ee1df2fab70b5f9ae2f8777b3f05f2"
            },
            "downloads": -1,
            "filename": "mcp-1.1.0-py3-none-any.whl",
            "has_sig": false,
            "md5_digest": "6e45e0e0a38b36892a291fc7418bdde6",
            "packagetype": "bdist_wheel",
            "python_version": "py3",
            "requires_python": ">=3.10",
            "size": 36576,
            "upload_time": "2024-12-03T22:39:17",
            "upload_time_iso_8601": "2024-12-03T22:39:17.880529Z",
            "url": "https://files.pythonhosted.org/packages/b9/3e/aef19ac08a6f9a347c086c4e628c2f7329659828cbe92ffd524ec2aac833/mcp-1.1.0-py3-none-any.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "77f2067b1fc114e8d3ae4af02fc4f4ed8971a2c4900362d976fabe0f4e9a3418",
                "md5": "be22d888a241d06635255d77115c07c1",
                "sha256": "e3c8d6df93a4de90230ea944dd667730744a3cd91a4cc0ee66a5acd53419e100"
            },
            "downloads": -1,
            "filename": "mcp-1.1.0.tar.gz",
            "has_sig": false,
            "md5_digest": "be22d888a241d06635255d77115c07c1",
            "packagetype": "sdist",
            "python_version": "source",
            "requires_python": ">=3.10",
            "size": 83802,
            "upload_time": "2024-12-03T22:39:19",
            "upload_time_iso_8601": "2024-12-03T22:39:19.157208Z",
            "url": "https://files.pythonhosted.org/packages/77/f2/067b1fc114e8d3ae4af02fc4f4ed8971a2c4900362d976fabe0f4e9a3418/mcp-1.1.0.tar.gz",
            "yanked": false,
            "yanked_reason": null
        }
    ],
    "upload_time": "2024-12-03 22:39:19",
    "github": true,
    "gitlab": false,
    "bitbucket": false,
    "codeberg": false,
    "github_user": "modelcontextprotocol",
    "github_project": "python-sdk",
    "travis_ci": false,
    "coveralls": false,
    "github_actions": true,
    "lcname": "mcp"
}
        