# Point Topic MCP Server
UK broadband data analysis server via Model Context Protocol. Simple stdio-based server for local development and Claude Desktop integration.
## ✅ what's implemented
**database tools** (require Snowflake credentials):
- `assemble_dataset_context()` - get schemas and examples for datasets (upc, upc_take_up, upc_forecast, tariffs, ontology)
- `execute_query()` - run safe read-only SQL queries
- `describe_table()` - get table schema details
- `get_la_code()` / `get_la_list_full()` - local authority lookups
**chart tools**:
- `get_point_topic_public_chart_catalog()` - browse public charts (no auth needed)
- `get_point_topic_public_chart_csv()` - get public chart data as CSV (no auth needed)
- `get_point_topic_chart_catalog()` - get complete catalog including private charts (requires API key)
- `get_point_topic_chart_csv()` - get any chart data as CSV with authentication (requires API key)
- `generate_authenticated_chart_url()` - create signed URLs for private charts (requires API key)
**server info**:
- `get_mcp_server_capabilities()` - check which tools are available and debug missing credentials
**conditional availability**: tools only appear if required environment variables are set
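`execute_query()` only accepts read-only SQL. The actual guard is internal to the server, but a naive sketch of such a check (illustrative only, not the shipped implementation) could look like:

```python
import re

# statements allowed to run: read-only verbs only (hypothetical allowlist)
READ_ONLY = re.compile(r"^\s*(select|with|show|describe|explain)\b", re.IGNORECASE)

def is_read_only(sql: str) -> bool:
    """Allow a single statement starting with a read verb; reject anything
    containing an embedded statement separator (naive sketch)."""
    body = sql.rstrip().rstrip(";")
    return bool(READ_ONLY.match(sql)) and ";" not in body
```

A real guard would also need to handle semicolons inside string literals and comments; this sketch only shows the shape of the check.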
## installation (for end users)
**simple pip install**:
```bash
pip install point-topic-mcp
```
**add to your MCP client** (Claude Desktop, Cursor, etc.):
```json
{
  "mcpServers": {
    "point-topic": {
      "command": "point-topic-mcp",
      "env": {
        "SNOWFLAKE_USER": "your_user",
        "SNOWFLAKE_PASSWORD": "your_password",
        "CHART_API_KEY": "your_chart_api_key"
      }
    }
  }
}
```
**note**: environment variables are optional - tools will only appear if credentials are provided. use `get_mcp_server_capabilities()` to check which tools are available.
**Claude Desktop config location**:
- Mac: `~/Library/Application Support/Claude/claude_desktop_config.json`
- Windows: `%APPDATA%\Claude\claude_desktop_config.json`
## development setup
setup: `uv sync`
**for local development with Claude Desktop**:
This will add the server to your Claude Desktop config.
```bash
uv run mcp install src/point_topic_mcp/server_local.py --with "snowflake-connector-python[pandas]" -f .env
```
**for mcp inspector**:
```bash
uv run mcp dev src/point_topic_mcp/server_local.py
```
**environment configuration**:
create `.env` file with your credentials:
```bash
# Snowflake database credentials (for database tools)
SNOWFLAKE_USER=your_user
SNOWFLAKE_PASSWORD=your_password
# Chart API key (for authenticated chart generation)
CHART_API_KEY=your_chart_api_key
```
## architecture
**stdio transport**: communicates with MCP clients via standard input/output for local integration
**auto-discovery**: tools and datasets are automatically discovered from module files - no manual registration needed
**conditional tools**: tools only register if required environment variables are present - use `get_mcp_server_capabilities()` to debug
**modular design**:
- `src/point_topic_mcp/tools/` - tool modules auto-discovered and registered
- `src/point_topic_mcp/context/datasets/` - dataset modules auto-discovered for context assembly
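The auto-discovery mechanism itself is internal to the server, but a minimal sketch of how public functions might be collected from a tools module (hypothetical, using `inspect`; the real registration logic may differ) looks like:

```python
import inspect
import types

def discover_tools(module: types.ModuleType) -> dict:
    """Collect public top-level functions defined in a tools module."""
    return {
        name: obj
        for name, obj in inspect.getmembers(module, inspect.isfunction)
        if not name.startswith("_") and obj.__module__ == module.__name__
    }

# stand-in for a real `*_tools.py` module, purely for demonstration
demo = types.ModuleType("demo_tools")
exec("def my_tool(x): return x\ndef _helper(): pass", demo.__dict__)

registry = discover_tools(demo)
```

Note how underscore-prefixed helpers and imported functions are skipped: only functions defined in the module itself become tools.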
## adding new tools
this project uses auto-discovery for tools - just add a function and it becomes available.
### tool structure
create a file in `src/point_topic_mcp/tools/` ending with `_tools.py`:
```python
# src/point_topic_mcp/tools/my_feature_tools.py
from typing import Optional
from mcp.server.fastmcp import Context
from mcp.server.session import ServerSession

def my_new_tool(param: str, ctx: Optional[Context[ServerSession, None]] = None) -> str:
    """Tool description visible to agents."""
    # your implementation
    return "result"
```
**that's it!** the tool is automatically discovered and registered.
### conditional tools (require credentials)
use `check_env_vars()` to conditionally define tools:
```python
from point_topic_mcp.core.utils import check_env_vars
from dotenv import load_dotenv
load_dotenv()
if check_env_vars('my_feature', ['MY_API_KEY']):
    def authenticated_tool(ctx: Optional[Context[ServerSession, None]] = None) -> str:
        """Only available if MY_API_KEY is set."""
        import os
        api_key = os.getenv('MY_API_KEY')
        # use api_key...
        return "result"
```
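`check_env_vars()` lives in `point_topic_mcp.core.utils`; its exact behavior isn't shown here, but a plausible minimal version (an assumption, not the shipped code) would be:

```python
import os

def check_env_vars(feature: str, required: list[str]) -> bool:
    """Return True only when every required variable is set and non-empty;
    otherwise report which ones are missing so the gap is easy to debug."""
    missing = [v for v in required if not os.getenv(v)]
    if missing:
        print(f"[{feature}] tools disabled - missing: {', '.join(missing)}")
        return False
    return True
```

Because the check runs at import time, tools gated this way simply never exist in the registry when credentials are absent, rather than failing at call time.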
### key principles
1. **auto-discovery**: any public function in `*_tools.py` files becomes a tool
2. **conditional registration**: wrap in `if check_env_vars()` for authenticated tools
3. **clear docstrings**: visible to agents at all times - keep concise and actionable
4. **type hints**: use for better agent understanding
## adding new datasets
this project uses a modular dataset system that allows easy addition of new data sources. each dataset is self-contained and automatically discovered by the MCP server.
### dataset structure
each dataset is a python module in `src/point_topic_mcp/context/datasets/` with two required functions:
```python
def get_dataset_summary():
    """Brief description visible to agents at all times.
    Keep concise - this goes in every agent prompt."""
    return "short description of what data is available"

def get_db_info():
    """Complete context: schema, instructions, examples.
    Only loaded when agent requests this dataset."""
    return f"""
    {DB_INFO}

    {DB_SCHEMA}

    {SQL_EXAMPLES}
    """
```
### key principles
1. **context window efficiency**: keep `get_dataset_summary()` extremely concise - it's always visible to agents
2. **lazy loading**: full context via `get_db_info()` only loads when needed
3. **self-contained**: each dataset module includes all its own schema, examples, and usage notes
4. **auto-discovery**: new `.py` files in the datasets directory are automatically available
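Lazy loading can be as simple as importing the dataset module only when an agent asks for it. A hedged sketch (the real loader may differ; the `datasets_demo` package and stub module below are purely illustrative):

```python
import importlib
import sys
import types

def load_dataset_context(name: str, package: str = "datasets_demo") -> str:
    """Import the dataset module on demand and return its full context."""
    mod = importlib.import_module(f"{package}.{name}")
    return mod.get_db_info()

# demo: register a fake package + dataset module so the loader can run
pkg = types.ModuleType("datasets_demo")
pkg.__path__ = []  # mark it as a package
fake = types.ModuleType("datasets_demo.upc")
fake.get_db_info = lambda: "schema + examples"
sys.modules["datasets_demo"] = pkg
sys.modules["datasets_demo.upc"] = fake
```

The point of the design is that `get_db_info()` is never called (and its module never imported) until an agent explicitly requests that dataset, keeping the default prompt small.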
### adding a new dataset
1. **create the module**: `src/point_topic_mcp/context/datasets/your_dataset.py`
2. **implement required functions**: `get_dataset_summary()` and `get_db_info()`
3. **test locally**: `uv run mcp dev src/point_topic_mcp/server_local.py`
4. **verify discovery**: agent should see your dataset in `assemble_dataset_context()` tool description
see existing modules (`upc.py`, `upc_take_up.py`, `upc_forecast.py`) for structure examples.
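Putting the two required functions together, a complete minimal module might look like this (every name below is illustrative, not taken from the real datasets):

```python
# hypothetical src/point_topic_mcp/context/datasets/my_dataset.py

DB_INFO = "MY_DATASET: hypothetical UK broadband metrics by local authority."
DB_SCHEMA = "my_table(la_code TEXT, premises INT, year INT)"
SQL_EXAMPLES = "SELECT la_code, premises FROM my_table WHERE year = 2024"

def get_dataset_summary():
    """Always visible to agents - keep it to one line."""
    return "my_dataset: hypothetical premises counts by local authority"

def get_db_info():
    """Full context, loaded only when an agent requests this dataset."""
    return f"{DB_INFO}\n\n{DB_SCHEMA}\n\n{SQL_EXAMPLES}"
```

Dropping a file like this into the datasets directory is all that's needed; no registration step follows.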
### optimization tips
- prioritize essential info in summaries
- use clear table descriptions that help agents choose the right dataset
- include common query patterns in examples
- sanity check data against known UK facts in instructions
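A sanity check against known UK facts can be a simple plausibility gate in the dataset's instructions or tests. For example (the bounds here are deliberately loose assumptions, not authoritative figures):

```python
def plausible_uk_premises_total(total_premises: int) -> bool:
    """Rough plausibility gate: UK-wide premises counts should land in the
    low tens of millions; anything far outside that suggests a bad query
    (e.g. a missing filter or an accidental cross join)."""
    return 20_000_000 <= total_premises <= 45_000_000
```

Encoding checks like this in the dataset's instructions helps agents catch obviously wrong aggregates before presenting them.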
## publishing to PyPI (for maintainers)
**build and test locally**:
```bash
# Build the package with uv (super fast!)
uv build
# Test installation locally
pip install dist/point_topic_mcp-*.whl
# Test the command works
point-topic-mcp
```
**publish to PyPI**:
```bash
# Set up PyPI credentials in ~/.pypirc first (one time setup)
# [pypi]
# username = __token__
# password = pypi-xxxxx...
# Publish to PyPI with the publish script
./publish_to_pypi.sh
```
**test installation from PyPI**:
```bash
pip install point-topic-mcp
point-topic-mcp
```