mcp-ollama


Name: mcp-ollama
Version: 0.1.3
Summary: MCP server for Ollama integration
Upload time: 2025-02-05 00:11:54
Home page: None
Author: None
Maintainer: None
Docs URL: None
Requires Python: >=3.10
License: MIT
Keywords: anthropic, claude, llm, mcp, ollama
Requirements: No requirements were recorded.
Travis-CI: No Travis.
Coveralls test coverage: No coveralls.
# MCP Ollama

A Model Context Protocol (MCP) server for integrating Ollama with Claude Desktop or other MCP clients.

## Requirements

- Python 3.10 or higher
- Ollama installed and running (https://ollama.com/download)
- At least one model pulled with Ollama (e.g., `ollama pull llama2`); a quick way to verify this is sketched below
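
Before configuring a client, it can help to confirm that Ollama is actually reachable. The snippet below is not part of mcp-ollama; it is a minimal sketch that queries the local Ollama REST API (assuming the default address, `http://localhost:11434`) to check that the daemon is running and that at least one model has been pulled:

```python
# Sanity check: query the local Ollama REST API for the list of pulled models.
# Assumes Ollama is listening on its default address, http://localhost:11434.
import json
import urllib.request

try:
    with urllib.request.urlopen("http://localhost:11434/api/tags", timeout=5) as resp:
        models = json.load(resp).get("models", [])
except OSError as exc:
    raise SystemExit(f"Ollama does not appear to be running: {exc}")

if not models:
    raise SystemExit("Ollama is running, but no models are pulled (try `ollama pull llama2`).")

print("Available models:", ", ".join(m["name"] for m in models))
```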

### Configure Claude Desktop

Add to your Claude Desktop configuration (`~/Library/Application Support/Claude/claude_desktop_config.json` on macOS, `%APPDATA%\Claude\claude_desktop_config.json` on Windows):

```json
{
  "mcpServers": {
    "ollama": {
      "command": "uvx",
      "args": [
        "mcp-ollama"
      ]
    }
  }
}
```

### Development

Install in development mode:
```bash
git clone https://github.com/yourusername/mcp-ollama.git
cd mcp-ollama
uv sync
```

Test with MCP Inspector:
```bash
mcp dev src/mcp_ollama/server.py
```

## Features

The server provides three main tools (see the client sketch after this list):
- `list_models` - List all downloaded Ollama models
- `show_model` - Get detailed information about a specific model
- `ask_model` - Ask a question to a specified model
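
To exercise these tools outside of Claude Desktop, a small programmatic client can be handy. The following is a minimal sketch using the MCP Python SDK (the `mcp` package), assuming the same `uvx mcp-ollama` launch command as in the Claude Desktop configuration above. The argument names passed to `ask_model` (`model`, `question`) are assumptions, not taken from this README; check the schema returned by `list_tools()` for the actual parameters.

```python
# Minimal MCP client sketch: launches mcp-ollama over stdio, lists its tools,
# and calls ask_model. The ask_model argument names are assumed, not confirmed.
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main() -> None:
    server = StdioServerParameters(command="uvx", args=["mcp-ollama"])
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Discover the tools exposed by the server.
            tools = await session.list_tools()
            print("Tools:", [tool.name for tool in tools.tools])

            # Ask a locally pulled model a question through the server.
            result = await session.call_tool(
                "ask_model",
                arguments={"model": "llama2", "question": "What is MCP?"},
            )
            print(result.content)


asyncio.run(main())
```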

## License

MIT

            

Raw data

            {
    "_id": null,
    "home_page": null,
    "name": "mcp-ollama",
    "maintainer": null,
    "docs_url": null,
    "requires_python": ">=3.10",
    "maintainer_email": null,
    "keywords": "anthropic, claude, llm, mcp, ollama",
    "author": null,
    "author_email": "Matt Green <emgeee@users.noreply.github.com>",
    "download_url": "https://files.pythonhosted.org/packages/80/26/577af2c24a4beda8f8691683df6f313fdd5fc7280c37533f0d72987a159d/mcp_ollama-0.1.3.tar.gz",
    "platform": null,
    "description": "# MCP Ollama\n\nA Model Context Protocol (MCP) server for integrating Ollama with Claude Desktop or other MCP clients.\n\n## Requirements\n\n- Python 3.10 or higher\n- Ollama installed and running (https://ollama.com/download)\n- At least one model pulled with Ollama (e.g., `ollama pull llama2`)\n\n### Configure Claude Desktop\n\nAdd to your Claude Desktop configuration (`~/Library/Application Support/Claude/claude_desktop_config.json` on macOS, `%APPDATA%\\Claude\\claude_desktop_config.json` on Windows):\n\n```json\n{\n  \"mcpServers\": {\n    \"ollama\": {\n      \"command\": \"uvx\",\n      \"args\": [\n        \"mcp-ollama\"\n      ]\n    }\n  }\n}\n```\n\n### Development\n\nInstall in development mode:\n```bash\ngit clone https://github.com/yourusername/mcp-ollama.git\ncd mcp-ollama\nuv sync\n```\n\nTest with MCP Inspector:\n```bash\nmcp dev src/mcp_ollama/server.py\n```\n\n## Features\n\nThe server provides four main tools:\n- `list_models` - List all downloaded Ollama models\n- `show_model` - Get detailed information about a specific model\n- `ask_model` - Ask a question to a specified model\n\n## License\n\nMIT\n",
    "bugtrack_url": null,
    "license": "MIT",
    "summary": "MCP server for Ollama integration",
    "version": "0.1.3",
    "project_urls": {
        "Bug Tracker": "https://github.com/emgeee/mcp-ollama/issues",
        "Homepage": "https://github.com/emgeee/mcp-ollama"
    },
    "split_keywords": [
        "anthropic",
        " claude",
        " llm",
        " mcp",
        " ollama"
    ],
    "urls": [
        {
            "comment_text": null,
            "digests": {
                "blake2b_256": "f854cdaacf2c878d2aa6c21b06318e6ee3307b767fe75695266d85c469f44072",
                "md5": "0052a4347e33bfe0fc3bd929f6b23dd4",
                "sha256": "57ec190b903a8ca78f3fd6e9fa77f56f3951741fa5fb0df29348ad40595f8ec6"
            },
            "downloads": -1,
            "filename": "mcp_ollama-0.1.3-py3-none-any.whl",
            "has_sig": false,
            "md5_digest": "0052a4347e33bfe0fc3bd929f6b23dd4",
            "packagetype": "bdist_wheel",
            "python_version": "py3",
            "requires_python": ">=3.10",
            "size": 4463,
            "upload_time": "2025-02-05T00:11:53",
            "upload_time_iso_8601": "2025-02-05T00:11:53.067562Z",
            "url": "https://files.pythonhosted.org/packages/f8/54/cdaacf2c878d2aa6c21b06318e6ee3307b767fe75695266d85c469f44072/mcp_ollama-0.1.3-py3-none-any.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": null,
            "digests": {
                "blake2b_256": "8026577af2c24a4beda8f8691683df6f313fdd5fc7280c37533f0d72987a159d",
                "md5": "85735da1da7c3ddd19bbf7a4835641bb",
                "sha256": "9e3017047721cc43da7192118e35d9cae76ff828fe08a27fcac56be64a12145f"
            },
            "downloads": -1,
            "filename": "mcp_ollama-0.1.3.tar.gz",
            "has_sig": false,
            "md5_digest": "85735da1da7c3ddd19bbf7a4835641bb",
            "packagetype": "sdist",
            "python_version": "source",
            "requires_python": ">=3.10",
            "size": 42880,
            "upload_time": "2025-02-05T00:11:54",
            "upload_time_iso_8601": "2025-02-05T00:11:54.947701Z",
            "url": "https://files.pythonhosted.org/packages/80/26/577af2c24a4beda8f8691683df6f313fdd5fc7280c37533f0d72987a159d/mcp_ollama-0.1.3.tar.gz",
            "yanked": false,
            "yanked_reason": null
        }
    ],
    "upload_time": "2025-02-05 00:11:54",
    "github": true,
    "gitlab": false,
    "bitbucket": false,
    "codeberg": false,
    "github_user": "emgeee",
    "github_project": "mcp-ollama",
    "travis_ci": false,
    "coveralls": false,
    "github_actions": true,
    "lcname": "mcp-ollama"
}
        