| Field | Value |
| --- | --- |
| Name | http-mcp |
| Version | 0.3.0 |
| Summary | An HTTP implementation of the MCP protocol |
| Homepage | https://github.com/yeison-liscano/http_mcp |
| Issues | https://github.com/yeison-liscano/http_mcp/issues |
| Requires Python | >=3.10 |
| Keywords | automation, http, llm, mcp |
| Upload time | 2025-08-24 16:27:35 |
# Simple HTTP MCP Server Implementation

This project provides a lightweight server implementation for the Model Context
Protocol (MCP) over HTTP. It allows you to expose Python functions as tools and
prompts that can be discovered and executed remotely via a JSON-RPC interface.
It is intended to be used with a Starlette or FastAPI application (see the
[demo](https://github.com/yeison-liscano/demo_http_mcp)).

The following badge corresponds to the example server for this project, found
in the [tests/app/ folder](tests/app).

<a href="https://glama.ai/mcp/servers/@yeison-liscano/http_mcp">
  <img width="380" height="200" src="https://glama.ai/mcp/servers/@yeison-liscano/http_mcp/badge" alt="Simple HTTP Server MCP server" />
</a>
## Features
- **MCP Protocol Compliant**: Implements the MCP specification for tool and
prompt discovery and execution. Notifications are not supported.
- **HTTP and STDIO Transport**: Uses HTTP (POST requests) or STDIO for
communication.
- **Async Support**: Built on `Starlette` or `FastAPI` for asynchronous request
handling.
- **Type-Safe**: Leverages `Pydantic` for robust data validation and
serialization.
- **Server State Management**: Access shared state through the lifespan context
using the `get_state_key` method.
- **Request Access**: Access the incoming request object from your tools and
prompts.
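
Since the HTTP transport speaks JSON-RPC over POST, a client request is just a JSON body. The sketch below builds an illustrative `tools/call` payload with the standard library; the field names follow the MCP specification rather than this library's internals, so treat the exact shape as an assumption.

```python
import json

# Illustrative JSON-RPC 2.0 request for invoking a tool over HTTP POST.
# Field names follow the MCP specification; the tool name and arguments
# here are examples, not part of http_mcp itself.
payload = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "greet",
        "arguments": {"question": "What is MCP?"},
    },
}

body = json.dumps(payload)
print(body)
```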
## Server Architecture
The library provides a single `MCPServer` class that uses lifespan to manage
shared state across the entire application lifecycle.
### MCPServer
The `MCPServer` is designed to work with Starlette's lifespan system for
managing shared server state.
**Key Characteristics:**
- **Lifespan Based**: Uses Starlette's lifespan events to initialize and manage
shared server state
- **Application-Level State**: State persists across the entire application
lifecycle, not per-request
- **Flexible**: Can be used with any custom context class stored in the lifespan
state
**Example Usage:**
```python
import contextlib
from collections.abc import AsyncIterator
from dataclasses import dataclass, field
from typing import TypedDict

from starlette.applications import Starlette

from http_mcp.server import MCPServer


@dataclass
class Context:
    call_count: int = 0
    user_preferences: dict = field(default_factory=dict)


class State(TypedDict):
    context: Context


@contextlib.asynccontextmanager
async def lifespan(_app: Starlette) -> AsyncIterator[State]:
    yield {"context": Context()}


mcp_server = MCPServer(
    name="my-server",
    version="1.0.0",
    tools=my_tools,
    prompts=my_prompts,
)

app = Starlette(lifespan=lifespan)
app.mount("/mcp", mcp_server.app)
```
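
The lifespan mechanics can be illustrated without Starlette: the async context manager below is a stdlib-only sketch (not part of http_mcp) showing why state created once at startup is shared by every request for the life of the application.

```python
import asyncio
import contextlib
from dataclasses import dataclass, field


@dataclass
class Context:
    call_count: int = 0
    user_preferences: dict = field(default_factory=dict)


@contextlib.asynccontextmanager
async def lifespan():
    # Setup runs once at application startup...
    state = {"context": Context()}
    try:
        yield state
    finally:
        # ...and teardown runs once at shutdown.
        state.clear()


async def main() -> int:
    async with lifespan() as state:
        # Every "request" sees the same Context instance.
        state["context"].call_count += 1
        state["context"].call_count += 1
        return state["context"].call_count


count = asyncio.run(main())
print(count)  # 2
```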
## Tools
Tools are the functions that can be called by the client.
Example:
1. **Define the arguments and output for the tools:**
```python
# app/tools/models.py
from pydantic import BaseModel, Field


class GreetInput(BaseModel):
    question: str = Field(description="The question to answer")


class GreetOutput(BaseModel):
    answer: str = Field(description="The answer to the question")

# Note: the description on Field will be passed when listing the tools.
# Having a description is optional, but it's recommended to provide one.
```
2. **Define the tools:**
```python
# app/tools/tools.py
from http_mcp.types import Arguments

from app.tools.models import GreetInput, GreetOutput


def greet(args: Arguments[GreetInput]) -> GreetOutput:
    return GreetOutput(answer=f"Hello, {args.inputs.question}!")
```
```python
# app/tools/__init__.py
from http_mcp.types import Tool

from app.tools.models import GreetInput, GreetOutput
from app.tools.tools import greet

TOOLS = (
    Tool(
        func=greet,
        inputs=GreetInput,
        output=GreetOutput,
    ),
)

__all__ = ["TOOLS"]
```
3. **Instantiate the server:**
```python
# app/main.py
from starlette.applications import Starlette

from http_mcp.server import MCPServer
from app.tools import TOOLS

mcp_server = MCPServer(tools=TOOLS, name="test", version="1.0.0")

app = Starlette()
app.mount(
    "/mcp",
    mcp_server.app,
)
```
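
The `Tool(...)` wrapper pairs a function with its input and output models so the server can introspect them when listing tools. The stdlib-only sketch below illustrates that registration pattern; the `Tool` dataclass here is our stand-in, not http_mcp's actual class.

```python
from dataclasses import dataclass
from typing import Any, Callable


@dataclass(frozen=True)
class Tool:
    # Illustrative stand-in for http_mcp.types.Tool.
    func: Callable[..., Any]
    inputs: type
    output: type

    @property
    def name(self) -> str:
        # A server can derive the advertised tool name from the function.
        return self.func.__name__


def greet(question: str) -> str:
    return f"Hello, {question}!"


TOOLS = (Tool(func=greet, inputs=str, output=str),)

# A "tools/list" handler would walk the registry and report each tool's
# name along with a schema derived from its input model.
names = [tool.name for tool in TOOLS]
print(names)  # ['greet']
```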
## Server State Management
The server uses Starlette's lifespan system to manage shared state across the
entire application lifecycle. State is initialized when the application starts
and persists until it shuts down. Context is accessed through the
`get_state_key` method on the `Arguments` object.
Example:
1. **Define a context class:**
```python
# app/context.py
from dataclasses import dataclass, field


@dataclass
class Context:
    called_tools: list[str] = field(default_factory=list)

    def get_called_tools(self) -> list[str]:
        return self.called_tools

    def add_called_tool(self, tool_name: str) -> None:
        self.called_tools.append(tool_name)
```
2. **Set up the application with lifespan:**
```python
import contextlib
from collections.abc import AsyncIterator
from typing import TypedDict

from starlette.applications import Starlette

from app.context import Context
from app.tools import TOOLS
from http_mcp.server import MCPServer


class State(TypedDict):
    context: Context


@contextlib.asynccontextmanager
async def lifespan(_app: Starlette) -> AsyncIterator[State]:
    yield {"context": Context(called_tools=[])}


mcp_server = MCPServer(
    tools=TOOLS,
    name="test",
    version="1.0.0",
)

app = Starlette(lifespan=lifespan)
app.mount("/mcp", mcp_server.app)
```
3. **Access the context in your tools:**
```python
from pydantic import BaseModel, Field

from http_mcp.types import Arguments
from app.context import Context


class MyToolArguments(BaseModel):
    question: str = Field(description="The question to answer")


class MyToolOutput(BaseModel):
    answer: str = Field(description="The answer to the question")


async def my_tool(args: Arguments[MyToolArguments]) -> MyToolOutput:
    # Access the context from lifespan state
    context = args.get_state_key("context", Context)
    context.add_called_tool("my_tool")
    ...
    return MyToolOutput(answer=f"Hello, {args.inputs.question}!")
```
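
Presumably `get_state_key` looks the key up in the lifespan state and checks it against the expected type. The helper below is a stdlib sketch of that contract; it is an assumption about the behavior, not http_mcp's actual implementation.

```python
from dataclasses import dataclass, field
from typing import TypeVar

T = TypeVar("T")


@dataclass
class Context:
    called_tools: list[str] = field(default_factory=list)


def get_state_key(state: dict, key: str, expected_type: type[T]) -> T:
    # Hypothetical helper mirroring Arguments.get_state_key's contract:
    # fetch the value and fail loudly if it is not of the expected type.
    value = state[key]
    if not isinstance(value, expected_type):
        raise TypeError(
            f"state[{key!r}] is {type(value).__name__}, "
            f"expected {expected_type.__name__}"
        )
    return value


state = {"context": Context()}
ctx = get_state_key(state, "context", Context)
ctx.called_tools.append("my_tool")
print(ctx.called_tools)  # ['my_tool']
```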
## Request Access
You can access the incoming request object from your tools. The request object
is passed to each tool call and can be used to access headers, cookies, and
other request data (e.g. `request.state`, `request.scope`).
```python
from pydantic import BaseModel, Field

from http_mcp.server import MCPServer
from http_mcp.types import Arguments, Tool


class MyToolArguments(BaseModel):
    question: str = Field(description="The question to answer")


class MyToolOutput(BaseModel):
    answer: str = Field(description="The answer to the question")


async def my_tool(args: Arguments[MyToolArguments]) -> MyToolOutput:
    # Access the incoming request
    auth_header = args.request.headers.get("Authorization")
    ...
    return MyToolOutput(answer=f"Hello, {args.inputs.question}!")


mcp_server = MCPServer(
    name="my-server",
    version="1.0.0",
    tools=(
        Tool(
            func=my_tool,
            inputs=MyToolArguments,
            output=MyToolOutput,
        ),
    ),
)
```
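
A common use of request access is authentication. The helper below is a stdlib-only sketch of extracting a bearer token from an `Authorization` header value; the helper name is ours, not part of http_mcp.

```python
from typing import Optional


def extract_bearer_token(auth_header: Optional[str]) -> Optional[str]:
    # Return the token from "Bearer <token>", or None if absent or malformed.
    if not auth_header:
        return None
    scheme, _, token = auth_header.partition(" ")
    if scheme.lower() != "bearer" or not token:
        return None
    return token


print(extract_bearer_token("Bearer abc123"))  # abc123
print(extract_bearer_token("Basic xyz"))      # None
```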
## Prompts
You can add interactive templates that users invoke explicitly. Prompts
support lifespan state access, just like tools.
1. **Define the arguments for the prompts:**
```python
from pydantic import BaseModel, Field

from http_mcp.mcp_types.content import TextContent
from http_mcp.mcp_types.prompts import PromptMessage
from http_mcp.types import Arguments, Prompt


class GetAdvice(BaseModel):
    topic: str = Field(description="The topic to get advice on")
    include_actionable_steps: bool = Field(
        description="Whether to include actionable steps in the advice",
        default=False,
    )


def get_advice(args: Arguments[GetAdvice]) -> tuple[PromptMessage, ...]:
    """Get advice on a topic."""
    template = """
    You are a helpful assistant that can give advice on {topic}.
    """
    if args.inputs.include_actionable_steps:
        template += """
        The advice should include actionable steps.
        """
    return (
        PromptMessage(
            role="user",
            content=TextContent(
                text=template.format(topic=args.inputs.topic),
            ),
        ),
    )


PROMPTS = (
    Prompt(
        func=get_advice,
        arguments_type=GetAdvice,
    ),
)
```
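
The branching in `get_advice` is plain string templating; stripped of the http_mcp types it reduces to this stdlib sketch (the function name here is ours, for illustration):

```python
def render_advice_prompt(topic: str, include_actionable_steps: bool = False) -> str:
    # Stand-alone version of get_advice's templating logic.
    template = "You are a helpful assistant that can give advice on {topic}."
    if include_actionable_steps:
        template += " The advice should include actionable steps."
    return template.format(topic=topic)


print(render_advice_prompt("testing"))
print(render_advice_prompt("testing", include_actionable_steps=True))
```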
2. **Using prompts with lifespan state:**
```python
from pydantic import BaseModel, Field

from http_mcp.mcp_types.content import TextContent
from http_mcp.mcp_types.prompts import PromptMessage
from http_mcp.types import Arguments, Prompt
from app.context import Context


class GetAdvice(BaseModel):
    topic: str = Field(description="The topic to get advice on")


def get_advice_with_context(args: Arguments[GetAdvice]) -> tuple[PromptMessage, ...]:
    """Get advice on a topic with context awareness."""
    # Access the context from lifespan state
    context = args.get_state_key("context", Context)
    called_tools = context.get_called_tools()
    template = """
    You are a helpful assistant that can give advice on {topic}.
    Previously called tools: {tools}
    """
    return (
        PromptMessage(
            role="user",
            content=TextContent(
                text=template.format(
                    topic=args.inputs.topic,
                    tools=", ".join(called_tools) if called_tools else "none",
                ),
            ),
        ),
    )


PROMPTS_WITH_CONTEXT = (
    Prompt(
        func=get_advice_with_context,
        arguments_type=GetAdvice,
    ),
)
```
3. **Instantiate the server:**
```python
from starlette.applications import Starlette

from app.prompts import PROMPTS
from http_mcp.server import MCPServer

app = Starlette()
mcp_server = MCPServer(tools=(), prompts=PROMPTS, name="test", version="1.0.0")

app.mount(
    "/mcp",
    mcp_server.app,
)
```
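
Clients retrieve a prompt with a `prompts/get` JSON-RPC call. The payload below is illustrative, with field names taken from the MCP specification rather than from this library:

```python
import json

# Illustrative JSON-RPC 2.0 request for fetching a rendered prompt.
# The prompt name and arguments mirror the GetAdvice example above.
payload = {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "prompts/get",
    "params": {
        "name": "get_advice",
        "arguments": {"topic": "testing"},
    },
}

print(json.dumps(payload, indent=2))
```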