mcp 1.2.0

Name: mcp
Version: 1.2.0
Summary: Model Context Protocol SDK
Author: Anthropic, PBC.
License: MIT
Requires Python: >=3.10
Keywords: automation, git, llm, mcp
Uploaded: 2025-01-03 16:23:39

# MCP Python SDK

<div align="center">

<strong>Python implementation of the Model Context Protocol (MCP)</strong>

[![PyPI][pypi-badge]][pypi-url]
[![MIT licensed][mit-badge]][mit-url]
[![Python Version][python-badge]][python-url]
[![Documentation][docs-badge]][docs-url]
[![Specification][spec-badge]][spec-url]
[![GitHub Discussions][discussions-badge]][discussions-url]

</div>

<!-- omit in toc -->
## Table of Contents

- [Overview](#overview)
- [Installation](#installation)
- [Quickstart](#quickstart)
- [What is MCP?](#what-is-mcp)
- [Core Concepts](#core-concepts)
  - [Server](#server)
  - [Resources](#resources)
  - [Tools](#tools)
  - [Prompts](#prompts)
  - [Images](#images)
  - [Context](#context)
- [Running Your Server](#running-your-server)
  - [Development Mode](#development-mode)
  - [Claude Desktop Integration](#claude-desktop-integration)
  - [Direct Execution](#direct-execution)
- [Examples](#examples)
  - [Echo Server](#echo-server)
  - [SQLite Explorer](#sqlite-explorer)
- [Advanced Usage](#advanced-usage)
  - [Low-Level Server](#low-level-server)
  - [Writing MCP Clients](#writing-mcp-clients)
  - [MCP Primitives](#mcp-primitives)
  - [Server Capabilities](#server-capabilities)
- [Documentation](#documentation)
- [Contributing](#contributing)
- [License](#license)

[pypi-badge]: https://img.shields.io/pypi/v/mcp.svg
[pypi-url]: https://pypi.org/project/mcp/
[mit-badge]: https://img.shields.io/pypi/l/mcp.svg
[mit-url]: https://github.com/modelcontextprotocol/python-sdk/blob/main/LICENSE
[python-badge]: https://img.shields.io/pypi/pyversions/mcp.svg
[python-url]: https://www.python.org/downloads/
[docs-badge]: https://img.shields.io/badge/docs-modelcontextprotocol.io-blue.svg
[docs-url]: https://modelcontextprotocol.io
[spec-badge]: https://img.shields.io/badge/spec-spec.modelcontextprotocol.io-blue.svg
[spec-url]: https://spec.modelcontextprotocol.io
[discussions-badge]: https://img.shields.io/github/discussions/modelcontextprotocol/python-sdk
[discussions-url]: https://github.com/modelcontextprotocol/python-sdk/discussions

## Overview

The Model Context Protocol allows applications to provide context for LLMs in a standardized way, separating the concerns of providing context from the actual LLM interaction. This Python SDK implements the full MCP specification, making it easy to:

- Build MCP clients that can connect to any MCP server
- Create MCP servers that expose resources, prompts and tools
- Use standard transports like stdio and SSE
- Handle all MCP protocol messages and lifecycle events

## Installation

We recommend using [uv](https://docs.astral.sh/uv/) to manage your Python projects:

```bash
uv add "mcp[cli]"
```

Alternatively:
```bash
pip install mcp
```

## Quickstart

Let's create a simple MCP server that exposes a calculator tool and some data:

```python
# server.py
from mcp.server.fastmcp import FastMCP

# Create an MCP server
mcp = FastMCP("Demo")

# Add an addition tool
@mcp.tool()
def add(a: int, b: int) -> int:
    """Add two numbers"""
    return a + b

# Add a dynamic greeting resource
@mcp.resource("greeting://{name}")
def get_greeting(name: str) -> str:
    """Get a personalized greeting"""
    return f"Hello, {name}!"
```

You can install this server in [Claude Desktop](https://claude.ai/download) and interact with it right away by running:
```bash
mcp install server.py
```

Alternatively, you can test it with the MCP Inspector:
```bash
mcp dev server.py
```

## What is MCP?

The [Model Context Protocol (MCP)](https://modelcontextprotocol.io) lets you build servers that expose data and functionality to LLM applications in a secure, standardized way. Think of it like a web API, but specifically designed for LLM interactions. MCP servers can:

- Expose data through **Resources** (think of these sort of like GET endpoints; they are used to load information into the LLM's context)
- Provide functionality through **Tools** (sort of like POST endpoints; they are used to execute code or otherwise produce a side effect)
- Define interaction patterns through **Prompts** (reusable templates for LLM interactions)
- And more!

## Core Concepts

### Server

The FastMCP server is your core interface to the MCP protocol. It handles connection management, protocol compliance, and message routing:

```python
from mcp.server.fastmcp import FastMCP

# Create a named server
mcp = FastMCP("My App")

# Specify dependencies for deployment and development
mcp = FastMCP("My App", dependencies=["pandas", "numpy"])
```

### Resources

Resources are how you expose data to LLMs. They're similar to GET endpoints in a REST API - they provide data but shouldn't perform significant computation or have side effects:

```python
@mcp.resource("config://app")
def get_config() -> str:
    """Static configuration data"""
    return "App configuration here"

@mcp.resource("users://{user_id}/profile")
def get_user_profile(user_id: str) -> str:
    """Dynamic user data"""
    return f"Profile data for user {user_id}"
```

### Tools

Tools let LLMs take actions through your server. Unlike resources, tools are expected to perform computation and have side effects:

```python
import httpx

@mcp.tool()
def calculate_bmi(weight_kg: float, height_m: float) -> float:
    """Calculate BMI given weight in kg and height in meters"""
    return weight_kg / (height_m ** 2)

@mcp.tool()
async def fetch_weather(city: str) -> str:
    """Fetch current weather for a city"""
    async with httpx.AsyncClient() as client:
        response = await client.get(f"https://api.weather.com/{city}")
        return response.text
```

### Prompts

Prompts are reusable templates that help LLMs interact with your server effectively:

```python
from mcp.server.fastmcp.prompts.base import AssistantMessage, Message, UserMessage

@mcp.prompt()
def review_code(code: str) -> str:
    return f"Please review this code:\n\n{code}"

@mcp.prompt()
def debug_error(error: str) -> list[Message]:
    return [
        UserMessage("I'm seeing this error:"),
        UserMessage(error),
        AssistantMessage("I'll help debug that. What have you tried so far?")
    ]
```

### Images

FastMCP provides an `Image` class that automatically handles image data:

```python
import io

from mcp.server.fastmcp import FastMCP, Image
from PIL import Image as PILImage

@mcp.tool()
def create_thumbnail(image_path: str) -> Image:
    """Create a thumbnail from an image"""
    img = PILImage.open(image_path)
    img.thumbnail((100, 100))
    # Encode the thumbnail as PNG bytes; PILImage.tobytes() would return raw
    # pixel data rather than a PNG-encoded image.
    buffer = io.BytesIO()
    img.save(buffer, format="PNG")
    return Image(data=buffer.getvalue(), format="png")
```

### Context

The Context object gives your tools and resources access to MCP capabilities:

```python
from mcp.server.fastmcp import FastMCP, Context

@mcp.tool()
async def long_task(files: list[str], ctx: Context) -> str:
    """Process multiple files with progress tracking"""
    for i, file in enumerate(files):
        ctx.info(f"Processing {file}")
        await ctx.report_progress(i, len(files))
        data = await ctx.read_resource(f"file://{file}")
    return "Processing complete"
```

## Running Your Server

### Development Mode

The fastest way to test and debug your server is with the MCP Inspector:

```bash
mcp dev server.py

# Add dependencies
mcp dev server.py --with pandas --with numpy

# Mount local code
mcp dev server.py --with-editable .
```

### Claude Desktop Integration

Once your server is ready, install it in Claude Desktop:

```bash
mcp install server.py

# Custom name
mcp install server.py --name "My Analytics Server"

# Environment variables
mcp install server.py -e API_KEY=abc123 -e DB_URL=postgres://...
mcp install server.py -f .env
```
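
Environment variables passed with `-e` (or loaded from a file with `-f`) end up in the installed server's process environment, so your server code can read them at runtime. A minimal sketch, reusing the `API_KEY` name from the example above; the `api_key_status` tool is purely illustrative:

```python
import os

from mcp.server.fastmcp import FastMCP

mcp = FastMCP("My Analytics Server")

@mcp.tool()
def api_key_status() -> str:
    """Report whether the API key configured at install time is present."""
    # API_KEY corresponds to the `-e API_KEY=...` flag shown above;
    # this tool exists only to illustrate reading installed environment variables.
    return "API key configured" if os.environ.get("API_KEY") else "API key missing"
```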

### Direct Execution

For advanced scenarios like custom deployments:

```python
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("My App")

if __name__ == "__main__":
    mcp.run()
```

Run it with:
```bash
python server.py
# or
mcp run server.py
```
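
The Overview also lists SSE as a standard transport. By default `mcp.run()` uses stdio; the sketch below assumes `FastMCP.run()` accepts a `transport` argument and is illustrative only:

```python
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("My App")

if __name__ == "__main__":
    # Serve over SSE instead of the default stdio transport
    # (assumes FastMCP.run() accepts a `transport` parameter).
    mcp.run(transport="sse")
```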

## Examples

### Echo Server

A simple server demonstrating resources, tools, and prompts:

```python
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("Echo")

@mcp.resource("echo://{message}")
def echo_resource(message: str) -> str:
    """Echo a message as a resource"""
    return f"Resource echo: {message}"

@mcp.tool()
def echo_tool(message: str) -> str:
    """Echo a message as a tool"""
    return f"Tool echo: {message}"

@mcp.prompt()
def echo_prompt(message: str) -> str:
    """Create an echo prompt"""
    return f"Please process this message: {message}"
```

### SQLite Explorer

A more complex example showing database integration:

```python
from mcp.server.fastmcp import FastMCP
import sqlite3

mcp = FastMCP("SQLite Explorer")

@mcp.resource("schema://main")
def get_schema() -> str:
    """Provide the database schema as a resource"""
    conn = sqlite3.connect("database.db")
    schema = conn.execute(
        "SELECT sql FROM sqlite_master WHERE type='table'"
    ).fetchall()
    return "\n".join(sql[0] for sql in schema if sql[0])

@mcp.tool()
def query_data(sql: str) -> str:
    """Execute SQL queries safely"""
    conn = sqlite3.connect("database.db")
    try:
        result = conn.execute(sql).fetchall()
        return "\n".join(str(row) for row in result)
    except Exception as e:
        return f"Error: {str(e)}"
```

## Advanced Usage

### Low-Level Server

For more control, you can use the low-level server implementation directly. This gives you full access to the protocol and allows you to customize every aspect of your server:

```python
from mcp.server.lowlevel import Server, NotificationOptions
from mcp.server.models import InitializationOptions
import mcp.server.stdio
import mcp.types as types

# Create a server instance
server = Server("example-server")

@server.list_prompts()
async def handle_list_prompts() -> list[types.Prompt]:
    return [
        types.Prompt(
            name="example-prompt",
            description="An example prompt template",
            arguments=[
                types.PromptArgument(
                    name="arg1",
                    description="Example argument",
                    required=True
                )
            ]
        )
    ]

@server.get_prompt()
async def handle_get_prompt(
    name: str,
    arguments: dict[str, str] | None
) -> types.GetPromptResult:
    if name != "example-prompt":
        raise ValueError(f"Unknown prompt: {name}")

    return types.GetPromptResult(
        description="Example prompt",
        messages=[
            types.PromptMessage(
                role="user",
                content=types.TextContent(
                    type="text",
                    text="Example prompt text"
                )
            )
        ]
    )

async def run():
    async with mcp.server.stdio.stdio_server() as (read_stream, write_stream):
        await server.run(
            read_stream,
            write_stream,
            InitializationOptions(
                server_name="example",
                server_version="0.1.0",
                capabilities=server.get_capabilities(
                    notification_options=NotificationOptions(),
                    experimental_capabilities={},
                )
            )
        )

if __name__ == "__main__":
    import asyncio
    asyncio.run(run())
```

### Writing MCP Clients

The SDK provides a high-level client interface for connecting to MCP servers:

```python
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Create server parameters for stdio connection
server_params = StdioServerParameters(
    command="python", # Executable
    args=["example_server.py"], # Optional command line arguments
    env=None # Optional environment variables
)

async def run():
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            # Initialize the connection
            await session.initialize()

            # List available prompts
            prompts = await session.list_prompts()

            # Get a prompt
            prompt = await session.get_prompt("example-prompt", arguments={"arg1": "value"})

            # List available resources
            resources = await session.list_resources()

            # List available tools
            tools = await session.list_tools()

            # Read a resource
            resource = await session.read_resource("file://some/path")

            # Call a tool
            result = await session.call_tool("tool-name", arguments={"arg1": "value"})

if __name__ == "__main__":
    import asyncio
    asyncio.run(run())
```

### MCP Primitives

The MCP protocol defines three core primitives that servers can implement:

| Primitive | Control               | Description                                         | Example Use                  |
|-----------|-----------------------|-----------------------------------------------------|------------------------------|
| Prompts   | User-controlled       | Interactive templates invoked by user choice        | Slash commands, menu options |
| Resources | Application-controlled| Contextual data managed by the client application   | File contents, API responses |
| Tools     | Model-controlled      | Functions exposed to the LLM to take actions        | API calls, data updates      |

### Server Capabilities

MCP servers declare capabilities during initialization:

| Capability  | Feature Flag                 | Description                        |
|-------------|------------------------------|------------------------------------|
| `prompts`   | `listChanged`                | Prompt template management         |
| `resources` | `subscribe`<br/>`listChanged`| Resource exposure and updates      |
| `tools`     | `listChanged`                | Tool discovery and execution       |
| `logging`   | -                            | Server logging configuration       |
| `completion`| -                            | Argument completion suggestions    |
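
In the low-level API, these capabilities are derived from the options passed to `server.get_capabilities()` in the Low-Level Server example above. A hedged sketch, assuming `NotificationOptions` exposes per-primitive `*_changed` flags that surface as the `listChanged` feature flags in the table:

```python
from mcp.server.lowlevel import NotificationOptions, Server

server = Server("example-server")

# Advertise tools.listChanged by opting in to tool list-change notifications.
# The `tools_changed` flag name is an assumption used for illustration.
capabilities = server.get_capabilities(
    notification_options=NotificationOptions(tools_changed=True),
    experimental_capabilities={},
)
```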

## Documentation

- [Model Context Protocol documentation](https://modelcontextprotocol.io)
- [Model Context Protocol specification](https://spec.modelcontextprotocol.io)
- [Officially supported servers](https://github.com/modelcontextprotocol/servers)

## Contributing

We are passionate about supporting contributors of all levels of experience and would love to see you get involved in the project. See the [contributing guide](CONTRIBUTING.md) to get started.

## License

This project is licensed under the MIT License - see the LICENSE file for details.