| Field | Value |
|-------|-------|
| Name | context-mcp-server |
| Version | 1.4.2 |
| Summary | A Model Context Protocol server providing intelligent context management and web content fetching capabilities for AI assistants |
| Author / Maintainer | langgpt |
| Requires Python | >=3.10 |
| License | MIT |
| Keywords | automation, http, llm, mcp |
| Upload time | 2025-07-20 03:27:00 |
# Context MCP Server
A Model Context Protocol (MCP) server that provides intelligent context management and web content fetching capabilities. This server enables AI assistants to efficiently store, retrieve, and manage contextual data while also fetching web content for real-time information access.
## Features
- 🔍 **Smart Content Fetching**: Retrieve web content via the Jina Reader API, with automatic fallback to direct HTTP
- 🌐 **Web Content Processing**: Convert HTML to markdown for better AI consumption
- 💾 **File Management**: Save fetched content to organized file structures
- 🚀 **High Performance**: Optimized fetching algorithms with intelligent caching
- 🔧 **Easy Integration**: Standard MCP protocol compatibility with various AI clients
## Available Tools
### fetch
Fetches content from a URL and returns it as text. The tool first attempts to retrieve content via the Jina Reader API, then falls back to a direct HTTP request if that fails.
**Arguments:**
- `url` (string, required): The URL to fetch content from
- `max_length` (integer, optional): Maximum number of characters to return (default: 5000)
- `start_index` (integer, optional): Start content from this character index (default: 0)
- `raw` (boolean, optional): Get raw content without markdown conversion (default: false)
**Returns:**
- The content of the URL as text
**Example usage:**
```
Please fetch the content from https://example.com
```
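The fallback strategy and the `start_index`/`max_length` windowing can be sketched as follows. This is a hypothetical illustration, not the package's actual implementation: the Jina Reader endpoint (`https://r.jina.ai/<url>`), the `getter` parameter, and the function names are all assumptions made for the example.

```python
from urllib.request import Request, urlopen
from urllib.error import URLError


def _http_get(url: str) -> str:
    """Plain HTTP GET returning decoded text (assumed helper)."""
    req = Request(url, headers={"User-Agent": "ModelContextProtocol/1.0"})
    with urlopen(req, timeout=10) as resp:
        return resp.read().decode("utf-8", errors="replace")


def fetch(url: str, max_length: int = 5000, start_index: int = 0,
          getter=_http_get) -> str:
    # Try the Jina Reader endpoint first, then the URL directly.
    for candidate in (f"https://r.jina.ai/{url}", url):
        try:
            text = getter(candidate)
            break
        except (URLError, OSError):
            continue
    else:
        raise RuntimeError(f"could not fetch {url}")
    # Apply the start_index / max_length window described above.
    return text[start_index:start_index + max_length]
```

The injectable `getter` is only there so the fallback logic can be exercised without network access; the real server presumably issues the requests directly.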
### fetch_and_save
Fetches content from a URL and saves it to a file. The tool first attempts to retrieve content via the Jina Reader API, then falls back to a direct HTTP request if that fails.
**Arguments:**
- `url` (string, required): The URL to fetch content from
- `file_path` (string, optional): The path at which to save the file. If not provided, a filename is generated automatically from the URL domain and a timestamp
- `raw` (boolean, optional): Get raw content without markdown conversion (default: false)
**Returns:**
- The path where the file was saved
**Example usage:**
```
Please fetch and save the content from https://example.com to article.txt
```
Or with automatic naming:
```
Please fetch and save the content from https://example.com
```
## Available Prompts
- **fetch**
  - Fetch a URL and extract its contents as markdown
  - Arguments:
    - `url` (string, required): URL to fetch
## Installation and Usage
### Local Development Setup
1. **Clone or download the source code:**
```bash
git clone https://github.com/LangGPT/context-mcp-server.git
cd context-mcp-server
```
2. **Install dependencies using uv:**
```bash
uv sync
```
3. **Test the server:**
```bash
uv run python -m context_mcp_server --help
```
### Using with Claude Desktop (Local Source)
Add this configuration to your Claude Desktop config file:
```json
{
  "mcpServers": {
    "context-mcp-server": {
      "command": "uv",
      "args": [
        "run",
        "--directory",
        "/path/to/your/context-mcp-server",
        "python",
        "-m",
        "context_mcp_server"
      ],
      "env": {
        "CONTEXT_DIR": "/path/to/your/data/directory"
      }
    }
  }
}
```
**Configuration file locations:**
- macOS: `~/Library/Application Support/Claude/claude_desktop_config.json`
- Windows: `%APPDATA%/Claude/claude_desktop_config.json`
- Linux: `~/.config/Claude/claude_desktop_config.json`
### Using with VS Code (Local Source)
Add to your VS Code settings or `.vscode/mcp.json`:
```json
{
  "mcpServers": {
    "context-mcp-server": {
      "command": "uv",
      "args": [
        "run",
        "--directory",
        "/path/to/your/context-mcp-server",
        "python",
        "-m",
        "context_mcp_server"
      ],
      "env": {
        "CONTEXT_DIR": "/path/to/your/data/directory"
      }
    }
  }
}
```
### Installation via Package Manager
#### Using uv (recommended)
When using [`uv`](https://docs.astral.sh/uv/), no specific installation is needed. Use [`uvx`](https://docs.astral.sh/uv/guides/tools/) to run *context-mcp-server* directly:
```bash
uvx context-mcp-server
```
#### Using pip
```bash
pip install context-mcp-server
```
After installation, run it as:
```bash
python -m context_mcp_server
```
### Package Manager Configuration
#### Claude Desktop with uvx
```json
{
  "mcpServers": {
    "context-mcp-server": {
      "command": "uvx",
      "args": ["context-mcp-server"],
      "env": {
        "CONTEXT_DIR": "/path/to/your/data/directory"
      }
    }
  }
}
```
#### VS Code with uvx
```json
{
  "mcp": {
    "servers": {
      "context-mcp-server": {
        "command": "uvx",
        "args": ["context-mcp-server"],
        "env": {
          "CONTEXT_DIR": "/path/to/your/data/directory"
        }
      }
    }
  }
}
```
## Configuration
### Environment Variables
#### CONTEXT_DIR
Sets the working directory where files will be saved when using the `fetch_and_save` tool.
- **Default**: `data`
- **Priority**: `CONTEXT_DIR` environment variable > default value `data`
**Example:**
```bash
export CONTEXT_DIR=/path/to/your/data
```
### Command Line Arguments
#### --user-agent
By default, the server selects a User-Agent based on whether the request originated from the model (via a tool call) or was user-initiated (via a prompt). For model-initiated requests it sends:
```
ModelContextProtocol/1.0 (Autonomous; +https://github.com/modelcontextprotocol/servers)
```
or:
```
ModelContextProtocol/1.0 (User-Specified; +https://github.com/modelcontextprotocol/servers)
```
This can be customized by adding the argument `--user-agent=YourUserAgent` to the `args` list in the configuration.
#### --proxy-url
The server can be configured to use a proxy by using the `--proxy-url` argument.
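The two flags could be wired up with standard `argparse` handling, as in the hedged sketch below; the package's real CLI construction may differ.

```python
import argparse


def build_parser() -> argparse.ArgumentParser:
    """Assumed CLI wiring for the two documented flags."""
    parser = argparse.ArgumentParser(prog="context-mcp-server")
    parser.add_argument("--user-agent",
                        help="Override the default User-Agent header")
    parser.add_argument("--proxy-url",
                        help="Route outgoing requests through this proxy")
    return parser
```

Note that `argparse` maps `--user-agent` to the attribute `args.user_agent` and `--proxy-url` to `args.proxy_url`.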
## Development
### Setting up Development Environment
1. **Install development dependencies:**
```bash
uv sync --dev
```
2. **Run linting and type checking:**
```bash
uv run ruff check
uv run pyright
```
3. **Build the package:**
```bash
uv build
```
### Testing
Test the server locally:
```bash
uv run python -m context_mcp_server
```
With custom work directory:
```bash
CONTEXT_DIR=/custom/path uv run python -m context_mcp_server
```
Use the MCP inspector for debugging:
```bash
npx @modelcontextprotocol/inspector uv run python -m context_mcp_server
```
With custom work directory:
```bash
CONTEXT_DIR=/custom/path npx @modelcontextprotocol/inspector uv run python -m context_mcp_server
```
### Making Changes
1. Edit the source code in `src/context_mcp_server/`
2. Test your changes with `uv run python -m context_mcp_server`
3. Update version in `pyproject.toml` if needed
4. Run tests and linting
## Debugging
You can use the MCP inspector to debug the server:
For local development:
```bash
npx @modelcontextprotocol/inspector uv run python -m context_mcp_server
```
For uvx installations:
```bash
npx @modelcontextprotocol/inspector uvx context-mcp-server
```
## Contributing
We encourage contributions to help expand and improve context-mcp-server. Whether you want to add new tools, enhance existing functionality, or improve documentation, your input is valuable.
## License
context-mcp-server is licensed under the MIT License. This means you are free to use, modify, and distribute the software, subject to the terms and conditions of the MIT License. For more details, please see the LICENSE file in the project repository.