ngff-zarr-mcp

Name: ngff-zarr-mcp
Version: 0.5.0
Summary: Model Context Protocol server for OME-Zarr image conversion
Author: Matt McCormick <matt@fideus.io>
Upload time: 2025-07-23 16:33:05
Requires Python: >=3.9
Keywords: image-conversion, mcp, microscopy, ome-zarr
# ngff-zarr MCP Server

`ngff-zarr-mcp` is a Model Context Protocol (MCP) server that provides AI agents
with the ability to convert images to OME-Zarr format using the ngff-zarr
library.

[![asciicast](https://asciinema.org/a/726628.png)](https://asciinema.org/a/726628)

## Features

### Tools

- **convert_images_to_ome_zarr**: Convert various image formats to OME-Zarr with
  full control over metadata, compression, and multiscale generation
- **read_ome_zarr_store**: Read OME-Zarr data with support for remote storage
  options
- **get_ome_zarr_info**: Inspect existing OME-Zarr stores and get detailed
  information
- **validate_ome_zarr_store**: Validate OME-Zarr structure and metadata
- **optimize_ome_zarr_store**: Optimize existing stores with new compression and
  chunking

### Resources

- **supported-formats**: List of supported input/output formats and backends
- **downsampling-methods**: Available downsampling methods for multiscale
  generation
- **compression-codecs**: Available compression codecs and their characteristics

### Input Support

- Local files (all formats supported by ngff-zarr)
- Local directories (Zarr stores)
- Network URLs (HTTP/HTTPS)
- S3 URLs (with optional s3fs dependency)
- Remote storage with authentication (AWS S3, Google Cloud Storage, Azure)
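Authenticated remote inputs are configured with fsspec-style option dictionaries. A minimal sketch follows s3fs conventions; the credential values are placeholders, and passing these to the tools via a storage-options parameter is illustrative:

```python
# fsspec/s3fs-style storage options for remote S3 inputs (sketch).
# Credential values below are placeholders, not real keys.

# Public bucket: anonymous access
anon_options = {"anon": True}

# Private bucket: explicit credentials
auth_options = {
    "key": "ACCESS_KEY_ID_PLACEHOLDER",
    "secret": "SECRET_ACCESS_KEY_PLACEHOLDER",
    "client_kwargs": {"region_name": "us-east-1"},
}

print(sorted(auth_options))
```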

### Advanced Features

- **RFC 4 - Anatomical Orientation**: Support for medical imaging orientation
  systems (LPS, RAS)
- **Method Metadata**: Enhanced multiscale metadata with downsampling method
  information
- **Storage Options**: Cloud storage authentication and configuration support
- **Multiscale Type Tracking**: Automatic detection and preservation of
  downsampling methods

### Output Optimization

- Multiple compression codecs (gzip, lz4, zstd, blosc variants)
- Configurable compression levels
- Flexible chunk sizing
- Sharding support (Zarr v3/OME-Zarr v0.5)
- OME-Zarr version selection (0.4 or 0.5)

## Installation

### Requirements

- Python >= 3.9
- Cursor, Windsurf, Claude Desktop, VS Code, or another MCP Client

### Quick Install

The easiest way to use the ngff-zarr MCP server is with `uvx`:

```bash
# Install uv (which provides the uvx command) if not already installed
pip install uv

# Run the MCP server directly from PyPI
uvx ngff-zarr-mcp
```

<details>
<summary><b>Install in Cursor</b></summary>

Go to: `Settings` -> `Cursor Settings` -> `MCP` -> `Add new global MCP server`

Pasting the following configuration into your Cursor `~/.cursor/mcp.json` file
is the recommended approach. You may also install in a specific project by
creating `.cursor/mcp.json` in your project folder. See
[Cursor MCP docs](https://docs.cursor.com/context/model-context-protocol) for
more info.

#### Using uvx (recommended)

```json
{
  "mcpServers": {
    "ngff-zarr": {
      "command": "uvx",
      "args": ["ngff-zarr-mcp"]
    }
  }
}
```

#### Using pip install

MCP clients do not run commands through a shell, so `&&` chains do not work in
`args`. First run `pip install ngff-zarr-mcp`, then reference the installed
entry point:

```json
{
  "mcpServers": {
    "ngff-zarr": {
      "command": "ngff-zarr-mcp"
    }
  }
}
```

</details>

<details>
<summary><b>Install in Windsurf</b></summary>

Add this to your Windsurf MCP config file. See
[Windsurf MCP docs](https://docs.windsurf.com/windsurf/mcp) for more info.

#### Using uvx (recommended)

```json
{
  "mcpServers": {
    "ngff-zarr": {
      "command": "uvx",
      "args": ["ngff-zarr-mcp"]
    }
  }
}
```

#### SSE Transport

```json
{
  "mcpServers": {
    "ngff-zarr": {
      "url": "http://localhost:8000/sse",
      "description": "ngff-zarr server running with SSE transport"
    }
  }
}
```

</details>

<details>
<summary><b>Install in VS Code</b></summary>

Add this to your VS Code MCP config file. See
[VS Code MCP docs](https://code.visualstudio.com/docs/copilot/chat/mcp-servers)
for more info.

#### Using uvx (recommended)

```json
"mcp": {
  "servers": {
    "ngff-zarr": {
      "command": "uvx",
      "args": ["ngff-zarr-mcp"]
    }
  }
}
```

#### Using pip install

First run `pip install ngff-zarr-mcp`, then reference the installed entry
point:

```json
"mcp": {
  "servers": {
    "ngff-zarr": {
      "command": "ngff-zarr-mcp"
    }
  }
}
```

</details>

<details>
<summary><b>Install in OpenCode</b></summary>

OpenCode is a Go-based CLI application that provides an AI-powered coding
assistant in the terminal. It supports MCP servers through JSON configuration
files. See [OpenCode MCP docs](https://opencode.ai/docs/mcp-servers/) for more
details.

Add this to your OpenCode configuration file (`~/.config/opencode/config.json`
for global or `opencode.json` for project-specific):

#### Using uvx (recommended)

```json
{
  "$schema": "https://opencode.ai/config.json",
  "mcp": {
    "ngff-zarr": {
      "type": "local",
      "command": ["uvx", "ngff-zarr-mcp"],
      "enabled": true
    }
  }
}
```

#### Using pip install

First run `pip install ngff-zarr-mcp`, then reference the installed entry
point:

```json
{
  "mcp": {
    "ngff-zarr": {
      "type": "local",
      "command": ["ngff-zarr-mcp"],
      "enabled": true
    }
  }
}
```

After adding the configuration, restart OpenCode. The ngff-zarr tools will be
available in the terminal interface with automatic permission prompts for tool
execution.

</details>

<details>
<summary><b>Install in Claude Desktop</b></summary>

Add this to your Claude Desktop `claude_desktop_config.json` file. See
[Claude Desktop MCP docs](https://modelcontextprotocol.io/quickstart/user) for
more info.

#### Using uvx (recommended)

```json
{
  "mcpServers": {
    "ngff-zarr": {
      "command": "uvx",
      "args": ["ngff-zarr-mcp"]
    }
  }
}
```

#### Using direct installation

```json
{
  "mcpServers": {
    "ngff-zarr": {
      "command": "ngff-zarr-mcp"
    }
  }
}
```

</details>

<details>
<summary><b>Install in Claude Code</b></summary>

Run this command. See
[Claude Code MCP docs](https://docs.anthropic.com/en/docs/agents-and-tools/claude-code/tutorials#set-up-model-context-protocol-mcp)
for more info.

#### Using uvx

```sh
claude mcp add ngff-zarr -- uvx ngff-zarr-mcp
```

#### Using pip

First run `pip install ngff-zarr-mcp` (a `&&` chain would be split by the shell
before reaching `claude mcp add`), then register the installed entry point:

```sh
claude mcp add ngff-zarr -- ngff-zarr-mcp
```

</details>

<details>
<summary><b>Install in Gemini CLI</b></summary>

Add this to your _.gemini/settings.json_ Gemini CLI MCP configuration. See the
[Gemini CLI configuration docs](https://github.com/google-gemini/gemini-cli/blob/main/docs/cli/configuration.md)
for more info.

#### Using uvx (recommended)

```json
{
  "mcpServers": {
    "ngff-zarr": {
      "command": "uvx",
      "args": ["ngff-zarr-mcp"]
    }
  }
}
```

</details>

<details>
<summary><b>Install in Cline</b></summary>

1. Open **Cline**.
2. Click the hamburger menu icon (☰) to enter the **MCP Servers** section.
3. Add a new server with the following configuration:

#### Using uvx (recommended)

```json
{
  "mcpServers": {
    "ngff-zarr": {
      "command": "uvx",
      "args": ["ngff-zarr-mcp"]
    }
  }
}
```

</details>

<details>
<summary><b>Install in BoltAI</b></summary>

Open the "Settings" page of the app, navigate to "Plugins," and enter the
following JSON:

```json
{
  "mcpServers": {
    "ngff-zarr": {
      "command": "uvx",
      "args": ["ngff-zarr-mcp"]
    }
  }
}
```

Once saved, you can start using ngff-zarr tools in your conversations. More
information is available on
[BoltAI's Documentation site](https://docs.boltai.com/docs/plugins/mcp-servers).

</details>

<details>
<summary><b>Install in Zed</b></summary>

Add this to your Zed `settings.json`. See
[Zed Context Server docs](https://zed.dev/docs/assistant/context-servers) for
more info.

```json
{
  "context_servers": {
    "ngff-zarr": {
      "command": "uvx",
      "args": ["ngff-zarr-mcp"]
    }
  }
}
```

</details>

<details>
<summary><b>Install in Augment Code</b></summary>

### **A. Using the Augment Code UI**

1. Click the hamburger menu.
2. Select **Settings**.
3. Navigate to the **Tools** section.
4. Click the **+ Add MCP** button.
5. Enter the following command: `uvx ngff-zarr-mcp`
6. Name the MCP: **ngff-zarr**.
7. Click the **Add** button.

### **B. Manual Configuration**

1. Press `Cmd/Ctrl+Shift+P` or open the hamburger menu in the Augment panel.
2. Select **Edit Settings**.
3. Under **Advanced**, click **Edit in settings.json**.
4. Add the server configuration:

```json
"augment.advanced": {
  "mcpServers": [
    {
      "name": "ngff-zarr",
      "command": "uvx",
      "args": ["ngff-zarr-mcp"]
    }
  ]
}
```

</details>

<details>
<summary><b>Install in JetBrains AI Assistant</b></summary>

See
[JetBrains AI Assistant Documentation](https://www.jetbrains.com/help/ai-assistant/configure-an-mcp-server.html)
for more details.

1. In JetBrains IDEs go to `Settings` -> `Tools` -> `AI Assistant` ->
   `Model Context Protocol (MCP)`
2. Click `+ Add`.
3. Click `Command` in the top-left corner of the dialog and select the
   `As JSON` option from the list.
4. Add this configuration and click `OK`:

```json
{
  "command": "uvx",
  "args": ["ngff-zarr-mcp"]
}
```

5. Click `Apply` to save changes.

</details>

<details>
<summary><b>Install in Qodo Gen</b></summary>

See
[Qodo Gen docs](https://docs.qodo.ai/qodo-documentation/qodo-gen/qodo-gen-chat/agentic-mode/agentic-tools-mcps)
for more details.

1. Open Qodo Gen chat panel in VSCode or IntelliJ.
2. Click Connect more tools.
3. Click + Add new MCP.
4. Add the following configuration:

```json
{
  "command": "uvx",
  "args": ["ngff-zarr-mcp"]
}
```

</details>

<details>
<summary><b>Install in Roo Code</b></summary>

Add this to your Roo Code MCP configuration file. See
[Roo Code MCP docs](https://docs.roocode.com/features/mcp/using-mcp-in-roo) for
more info.

```json
{
  "mcpServers": {
    "ngff-zarr": {
      "command": "uvx",
      "args": ["ngff-zarr-mcp"]
    }
  }
}
```

</details>

<details>
<summary><b>Install in Amazon Q Developer CLI</b></summary>

Add this to your Amazon Q Developer CLI configuration file. See
[Amazon Q Developer CLI docs](https://docs.aws.amazon.com/amazonq/latest/qdeveloper-ug/command-line-mcp-configuration.html)
for more details.

```json
{
  "mcpServers": {
    "ngff-zarr": {
      "command": "uvx",
      "args": ["ngff-zarr-mcp"]
    }
  }
}
```

</details>

<details>
<summary><b>Install in Zencoder</b></summary>

To configure ngff-zarr MCP in Zencoder, follow these steps:

1. Go to the Zencoder menu (`...`).
2. From the dropdown menu, select **Agent tools**.
3. Click **Add custom MCP**.
4. Add the name and server configuration below, and be sure to click the
   **Install** button.

```json
{
  "command": "uvx",
  "args": ["ngff-zarr-mcp"]
}
```

Once the MCP server is added, it is ready to use.

</details>

<details>
<summary><b>Install in Warp</b></summary>

See
[Warp Model Context Protocol Documentation](https://docs.warp.dev/knowledge-and-collaboration/mcp#adding-an-mcp-server)
for details.

1. Navigate to `Settings` > `AI` > `Manage MCP servers`.
2. Add a new MCP server by clicking the `+ Add` button.
3. Paste the configuration given below:

```json
{
  "ngff-zarr": {
    "command": "uvx",
    "args": ["ngff-zarr-mcp"]
  }
}
```

4. Click `Save` to apply the changes.

</details>

<details>
<summary><b>Development Installation</b></summary>

For development work, use pixi (recommended) or pip:

#### Using pixi (Recommended)

```bash
# Install pixi if not already installed
curl -fsSL https://pixi.sh/install.sh | bash

# Clone and setup environment
git clone <repository>
cd mcp/
pixi install

# Development environment (includes all dev tools)
pixi shell -e dev

# Run development server
pixi run dev-server

# Run tests and checks
pixi run test
pixi run lint
pixi run typecheck
```

#### Using pip

```bash
# Clone and install in development mode
git clone <repository>
cd mcp/
pip install -e ".[all]"

# Run the server
ngff-zarr-mcp
```

</details>

## Usage

### As an MCP Server

The server can be run in different transport modes:

```bash
# STDIO transport (default)
ngff-zarr-mcp

# Server-Sent Events transport
ngff-zarr-mcp --transport sse --host localhost --port 8000
```

### Transport Options

- **STDIO**: Default transport for most MCP clients
- **SSE**: Server-Sent Events for web-based clients or when HTTP transport is
  preferred

See the installation section above for client-specific configuration examples.

## Examples

### Convert a Single Image

```python
# Through an MCP client, the agent can call:
result = await convert_images_to_ome_zarr(
    input_paths=["image.tif"],
    output_path="output.ome.zarr",
    ome_zarr_version="0.4",
    scale_factors=[2, 4, 8],
    method="itkwasm_gaussian",
    compression_codec="zstd"
)
```

### Convert with Metadata

```python
result = await convert_images_to_ome_zarr(
    input_paths=["image.nii.gz"],
    output_path="brain.ome.zarr",
    dims=["z", "y", "x"],
    scale={"z": 2.0, "y": 0.5, "x": 0.5},
    units={"z": "micrometer", "y": "micrometer", "x": "micrometer"},
    name="Brain MRI",
    scale_factors=[2, 4]
)
```

### Optimize Existing Store

```python
result = await optimize_ome_zarr_store(
    input_path="large.ome.zarr",
    output_path="optimized.ome.zarr",
    compression_codec="blosc:zstd",
    chunks=[64, 64, 64]
)
```

### Get Store Information

```python
info = await get_ome_zarr_info("data.ome.zarr")
print(f"Size: {info.size_bytes} bytes")
print(f"Scales: {info.num_scales}")
print(f"Dimensions: {info.dimensions}")
```
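Since `size_bytes` is a raw byte count, a small helper (hypothetical, not part of the server API) can render it in human-readable form:

```python
def human_size(num_bytes: int) -> str:
    """Format a byte count as a human-readable string (binary units)."""
    units = ["B", "KiB", "MiB", "GiB", "TiB"]
    size = float(num_bytes)
    for unit in units:
        if size < 1024 or unit == units[-1]:
            return f"{size:.1f} {unit}"
        size /= 1024


print(human_size(3 * 1024**2))  # prints "3.0 MiB"
```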

## Supported Formats

### Input Formats

- **ITK/ITK-Wasm**: .nii, .nii.gz, .mha, .mhd, .nrrd, .dcm, .jpg, .png, .bmp,
  etc.
- **TIFF**: .tif, .tiff, .svs, .ndpi, .scn, etc. via tifffile
- **Video**: .webm, .mp4, .avi, .mov, .gif, etc. via imageio
- **Zarr**: .zarr, .ome.zarr

### Output Formats

- OME-Zarr (.ome.zarr, .zarr)

## Performance Options

### Memory Management

- Set memory targets to control RAM usage
- Use caching for large datasets
- Configure Dask LocalCluster for distributed processing

### Compression

- Choose from multiple codecs: gzip, lz4, zstd, blosc variants
- Adjust compression levels for speed vs. size tradeoffs
- Use sharding to reduce file count (Zarr v3)
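The speed vs. size tradeoff behaves the same way across codecs; a quick sketch using Python's built-in `zlib` (the server's codecs such as zstd and blosc expose analogous level settings):

```python
import zlib

# A highly repetitive payload compresses well at any level
data = b"microscopy " * 10_000

fast = zlib.compress(data, 1)   # low level: faster, larger output
small = zlib.compress(data, 9)  # high level: slower, smaller output

# Higher levels trade CPU time for a smaller result
assert len(small) <= len(fast) < len(data)
print(len(data), len(fast), len(small))
```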

### Chunking

- Optimize chunk sizes for your access patterns
- Configure sharding for better performance with cloud storage
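A chunk's uncompressed footprint is simply the product of its edge lengths times the dtype item size; a hypothetical helper for eyeballing candidate chunk shapes:

```python
import math


def chunk_nbytes(chunks: list[int], itemsize: int) -> int:
    """Uncompressed size in bytes of one chunk (shape product * item size)."""
    return math.prod(chunks) * itemsize


# 64x64x64 voxels of uint16 (2 bytes each) -> 512 KiB per chunk
print(chunk_nbytes([64, 64, 64], 2))  # 524288
```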

## Development

### Using pixi (Recommended)

Pixi provides reproducible, cross-platform environment management. All Python
dependencies are defined in `pyproject.toml` and automatically managed by pixi.

```bash
# Clone and setup environment
git clone <repository>
cd mcp/
pixi install

# Development environment (includes all dev tools)
pixi shell -e dev

# Run tests
pixi run test
pixi run test-cov

# Lint and format code
pixi run lint
pixi run format
pixi run typecheck

# Run all checks
pixi run all-checks
```

#### Pixi Environments

- **default**: Runtime dependencies only (from `[project.dependencies]`)
- **dev**: Development tools (pytest, black, mypy, ruff)
- **cloud**: Cloud storage support (s3fs, gcsfs)
- **all**: Complete feature set (all ngff-zarr dependencies + cloud)

```bash
pixi shell -e dev      # Development work
pixi shell -e cloud    # Cloud storage testing
pixi shell -e all      # Full feature testing
```

### Using traditional tools

```bash
# Clone and install in development mode
git clone <repository>
cd mcp/
pip install -e ".[all]"

# Run tests
pytest

# Lint code
black .
ruff check .
```

## Dependencies

### Core

- mcp: Model Context Protocol implementation
- ngff-zarr: Core image conversion functionality
- pydantic: Data validation
- httpx: HTTP client for remote files
- aiofiles: Async file operations

### Optional

- s3fs: S3 storage support
- gcsfs: Google Cloud Storage support
- dask[distributed]: Distributed processing

## 🚨 Troubleshooting

<details>
<summary><b>Python Version Issues</b></summary>

The ngff-zarr-mcp server requires Python 3.9 or higher. If you encounter version
errors:

```bash
# Check your Python version
python --version

# Use uvx to automatically handle Python environments
uvx ngff-zarr-mcp
```

</details>

<details>
<summary><b>Package Not Found Errors</b></summary>

If you encounter package not found errors with uvx:

```bash
# Update uv (which provides the uvx command)
pip install --upgrade uv

# Or install the tool persistently and run the installed entry point
uv tool install ngff-zarr-mcp
ngff-zarr-mcp
```

</details>

<details>
<summary><b>Permission Issues</b></summary>

If you encounter permission errors during installation:

```bash
# Install uv for the current user only
pip install --user uv

# Or create a virtual environment
python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate
pip install ngff-zarr-mcp
```

</details>

<details>
<summary><b>Memory Issues with Large Images</b></summary>

For large images, you may need to adjust memory settings:

```bash
# Start server with memory limit
ngff-zarr-mcp --memory-target 8GB

# Or use chunked processing in your conversion calls
# convert_images_to_ome_zarr(chunks=[512, 512, 64])
```

</details>

<details>
<summary><b>Network Issues with Remote Files</b></summary>

If you have issues accessing remote files:

```bash
# Test basic connectivity
curl -I <your-url>

# For S3 URLs, ensure s3fs is installed
pip install s3fs

# Configure AWS credentials if needed
aws configure
```

</details>

<details>
<summary><b>General MCP Client Errors</b></summary>

1. Ensure your MCP client supports the latest MCP protocol version
2. Check that the server starts correctly: `uvx ngff-zarr-mcp --help`
3. Verify JSON configuration syntax in your client config
4. Try restarting your MCP client after configuration changes
5. Check client logs for specific error messages

</details>

## License

MIT License - see LICENSE file for details.
If you encounter version\nerrors:\n\n```bash\n# Check your Python version\npython --version\n\n# Use uvx to automatically handle Python environments\nuvx ngff-zarr-mcp\n```\n\n</details>\n\n<details>\n<summary><b>Package Not Found Errors</b></summary>\n\nIf you encounter package not found errors with uvx:\n\n```bash\n# Update uvx\npip install --upgrade uvx\n\n# Try installing the package explicitly first\nuvx install ngff-zarr-mcp\nuvx ngff-zarr-mcp\n```\n\n</details>\n\n<details>\n<summary><b>Permission Issues</b></summary>\n\nIf you encounter permission errors during installation:\n\n```bash\n# Use user installation\npip install --user uvx\n\n# Or create a virtual environment\npython -m venv venv\nsource venv/bin/activate  # On Windows: venv\\Scripts\\activate\npip install ngff-zarr-mcp\n```\n\n</details>\n\n<details>\n<summary><b>Memory Issues with Large Images</b></summary>\n\nFor large images, you may need to adjust memory settings:\n\n```bash\n# Start server with memory limit\nngff-zarr-mcp --memory-target 8GB\n\n# Or use chunked processing in your conversion calls\n# convert_images_to_ome_zarr(chunks=[512, 512, 64])\n```\n\n</details>\n\n<details>\n<summary><b>Network Issues with Remote Files</b></summary>\n\nIf you have issues accessing remote files:\n\n```bash\n# Test basic connectivity\ncurl -I <your-url>\n\n# For S3 URLs, ensure s3fs is installed\npip install s3fs\n\n# Configure AWS credentials if needed\naws configure\n```\n\n</details>\n\n<details>\n<summary><b>General MCP Client Errors</b></summary>\n\n1. Ensure your MCP client supports the latest MCP protocol version\n2. Check that the server starts correctly: `uvx ngff-zarr-mcp --help`\n3. Verify JSON configuration syntax in your client config\n4. Try restarting your MCP client after configuration changes\n5. Check client logs for specific error messages\n\n</details>\n\n## License\n\nMIT License - see LICENSE file for details.\n",
    "bugtrack_url": null,
    "license": null,
    "summary": "Model Context Protocol server for OME-Zarr image conversion",
    "version": "0.5.0",
    "project_urls": {
        "Documentation": "https://github.com/thewtex/ngff-zarr#readme",
        "Issues": "https://github.com/thewtex/ngff-zarr/issues",
        "Source": "https://github.com/thewtex/ngff-zarr"
    },
    "split_keywords": [
        "image-conversion",
        " mcp",
        " microscopy",
        " ome-zarr"
    ],
    "urls": [
        {
            "comment_text": null,
            "digests": {
                "blake2b_256": "7f933a9551a723fb59034aea03ea6f2e2d1caf9bf980ef3007db951080389612",
                "md5": "8c1334d45c24c6b7b17be23f32a9f3af",
                "sha256": "8892394de912ef63aa647210491ce512a3513ca25375e88d410bd8cc7e990336"
            },
            "downloads": -1,
            "filename": "ngff_zarr_mcp-0.5.0-py3-none-any.whl",
            "has_sig": false,
            "md5_digest": "8c1334d45c24c6b7b17be23f32a9f3af",
            "packagetype": "bdist_wheel",
            "python_version": "py3",
            "requires_python": ">=3.9",
            "size": 21738,
            "upload_time": "2025-07-23T16:33:07",
            "upload_time_iso_8601": "2025-07-23T16:33:07.888033Z",
            "url": "https://files.pythonhosted.org/packages/7f/93/3a9551a723fb59034aea03ea6f2e2d1caf9bf980ef3007db951080389612/ngff_zarr_mcp-0.5.0-py3-none-any.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": null,
            "digests": {
                "blake2b_256": "063aa9ab16b5b58a3150fa63f137a9461b1db757c6087a99e1872f0fab3f6542",
                "md5": "18cf9dd3dcecf17453ccfdb3b1f9f935",
                "sha256": "cc00f2b1f2910243d91a926a39697ad60f13c181760078821c9475284618122e"
            },
            "downloads": -1,
            "filename": "ngff_zarr_mcp-0.5.0.tar.gz",
            "has_sig": false,
            "md5_digest": "18cf9dd3dcecf17453ccfdb3b1f9f935",
            "packagetype": "sdist",
            "python_version": "source",
            "requires_python": ">=3.9",
            "size": 41165233,
            "upload_time": "2025-07-23T16:33:05",
            "upload_time_iso_8601": "2025-07-23T16:33:05.093192Z",
            "url": "https://files.pythonhosted.org/packages/06/3a/a9ab16b5b58a3150fa63f137a9461b1db757c6087a99e1872f0fab3f6542/ngff_zarr_mcp-0.5.0.tar.gz",
            "yanked": false,
            "yanked_reason": null
        }
    ],
    "upload_time": "2025-07-23 16:33:05",
    "github": true,
    "gitlab": false,
    "bitbucket": false,
    "codeberg": false,
    "github_user": "thewtex",
    "github_project": "ngff-zarr#readme",
    "travis_ci": false,
    "coveralls": false,
    "github_actions": true,
    "lcname": "ngff-zarr-mcp"
}
        
Elapsed time: 1.80197s