mcp-chat 0.2.13

- **Summary**: Simple MCP Client to quickly test and explore MCP servers from the command line
- **Upload time**: 2025-08-14 06:24:19
- **Requires Python**: >=3.11
- **Keywords**: cli, client, explore, langchain, mcp, model-context-protocol, python, quick, simple, test, tools, try
# Simple MCP Client to Explore MCP Servers [![License: MIT](https://img.shields.io/badge/License-MIT-blue.svg)](https://github.com/hideya/langchain-mcp-tools-py/blob/main/LICENSE) [![pypi version](https://img.shields.io/pypi/v/mcp-chat.svg)](https://pypi.org/project/mcp-chat/)


**Quickly test and explore MCP servers from the command line!**

A simple, text-based CLI client for [Model Context Protocol (MCP)](https://modelcontextprotocol.io/) servers built with LangChain and Python.  
Suitable for testing MCP servers, exploring their capabilities, and prototyping integrations.

Internally it uses [LangChain ReAct Agent](https://langchain-ai.github.io/langgraph/reference/agents/) and
a utility function `convert_mcp_to_langchain_tools()` from [`langchain_mcp_tools`](https://pypi.org/project/langchain-mcp-tools/).  
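
For orientation, the following is a minimal sketch of how those two pieces fit together, assuming the `(tools, cleanup)` return shape documented by `langchain_mcp_tools`; it is an illustration, not this tool's actual source:

```python
import asyncio

from langchain_openai import ChatOpenAI
from langgraph.prebuilt import create_react_agent
from langchain_mcp_tools import convert_mcp_to_langchain_tools


async def main() -> None:
    mcp_servers = {
        "us-weather": {
            "command": "npx",
            "args": ["-y", "@h1deya/mcp-server-weather"],
        },
    }
    # Start the MCP servers and wrap their tools as LangChain tools;
    # `cleanup` closes the server sessions when you are done.
    tools, cleanup = await convert_mcp_to_langchain_tools(mcp_servers)
    try:
        agent = create_react_agent(ChatOpenAI(model="gpt-4o-mini"), tools)
        result = await agent.ainvoke(
            {"messages": [("user", "Are there any weather alerts in California?")]}
        )
        print(result["messages"][-1].content)
    finally:
        await cleanup()


asyncio.run(main())
```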

A TypeScript equivalent of this utility is available [here](https://www.npmjs.com/package/@h1deya/mcp-try-cli).

## Prerequisites

- Python 3.11+
- [optional] [`uv` (`uvx`)](https://docs.astral.sh/uv/getting-started/installation/)
  installed to run Python package-based MCP servers
- [optional] [npm 7+ (`npx`)](https://docs.npmjs.com/downloading-and-installing-node-js-and-npm)
  to run Node.js package-based MCP servers
- LLM API key(s) from
  [OpenAI](https://platform.openai.com/api-keys),
  [Anthropic](https://console.anthropic.com/settings/keys),
  [Google AI Studio (for GenAI/Gemini)](https://aistudio.google.com/apikey),
  [xAI](https://console.x.ai/),
  [Cerebras](https://cloud.cerebras.ai),
  and/or
  [Groq](https://console.groq.com/keys),
  as needed

## Quick Start

- Install the `mcp-chat` tool.
  This can take a few minutes to complete:
  ```bash
  pip install mcp-chat
  ```

- Configure the LLM and MCP server settings in the configuration file, `llm_mcp_config.json5`:
  ```bash
  code llm_mcp_config.json5
  ```

  The following is a simple configuration for quick testing:
  ```json5
  {
    "llm": {
      "provider": "openai", "model": "gpt-5-mini",
      // "provider": "anthropic",    "model": "claude-3-5-haiku-latest",
      // "provider": "google_genai", "model": "gemini-2.5-flash",
      // "provider": "xai",          "model": "grok-3-mini",
      // "provider": "cerebras",     "model": "gpt-oss-120b",
      // "provider": "grok",         "model": "openai/gpt-oss-20b",
    },

    "mcp_servers": {
      "us-weather": {  // US weather only
        "command": "npx", 
        "args": ["-y", "@h1deya/mcp-server-weather"]
      },
    },

    "example_queries": [
      "Tell me how LLMs work in a few sentences",
      "Are there any weather alerts in California?",
    ],
  }
  ```

- Set up API keys
  ```bash
  echo "ANTHROPIC_API_KEY=sk-ant-...                                       
  OPENAI_API_KEY=sk-proj-...
  GOOGLE_API_KEY=AI...
  XAI_API_KEY=xai-...
  CEREBRAS_API_KEY=csk-...
  GROQ_API_KEY=gsk_..." > .env
  
  code .env
  ```

- Run the tool
  ```bash
  mcp-chat
  ```
  By default, it reads the configuration file, `llm_mcp_config.json5`, from the current directory.  
  It then loads the environment variables defined in the `.env` file,
  in addition to those already set in your shell.
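
If the tool follows the common `python-dotenv` convention (an assumption; this README does not say), the `.env` file fills in variables without clobbering ones already set in your shell:

```python
import os
from dotenv import load_dotenv  # pip install python-dotenv

os.environ["OPENAI_API_KEY"] = "set-in-shell"
load_dotenv(".env")  # by default, already-defined variables are NOT overridden
assert os.environ["OPENAI_API_KEY"] == "set-in-shell"
```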

## Building from Source

See [README_DEV.md](https://github.com/hideya/mcp-client-langchain-py/blob/main/README_DEV.md) for details.

## Features

- **Easy setup**: Works out of the box with popular MCP servers
- **Flexible configuration**: JSON5 config with environment variable support
- **Multiple LLM/API providers**: OpenAI, Anthropic, Google (GenAI), xAI, Cerebras, Groq
- **Command & URL servers**: Support for both local and remote MCP servers
- **Local MCP Server logging**: Save stdio MCP server logs with customizable log directory
- **Interactive testing**: Configurable example queries make repeated testing convenient

## Limitations

- **Tool Return Types**: Currently, only text results of tool calls are supported.
It uses LangChain's `response_format: 'content'` (the default) internally, which only supports text strings.
While MCP tools can return multiple content types (text, images, etc.), this library currently filters and uses only text content (see the sketch after this list).
- **MCP Features**: Only MCP [Tools](https://modelcontextprotocol.io/docs/concepts/tools) are supported. Other MCP features like Resources, Prompts, and Sampling are not implemented.
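
To illustrate the first limitation, here is a sketch of the kind of filtering involved. The `type`/`text` fields follow the MCP content schema, but `extract_text_content` itself is a hypothetical helper, not this tool's API:

```python
def extract_text_content(content_items: list[dict]) -> str:
    """Keep only the text parts of an MCP tool result (hypothetical helper)."""
    return "\n".join(
        item["text"] for item in content_items if item.get("type") == "text"
    )


# A tool result mixing text and an image yields only the text.
print(extract_text_content([
    {"type": "text", "text": "Alert: heat advisory in effect."},
    {"type": "image", "data": "...base64...", "mimeType": "image/png"},
]))  # -> Alert: heat advisory in effect.
```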

## Usage

### Basic Usage

```bash
mcp-chat
```

By default, it reads the configuration file, `llm_mcp_config.json5`, from the current directory.  
Then, it loads the environment variables defined in the `.env` file,
in addition to those already set in your shell.  
It outputs local MCP server logs to the current directory.

### With Options

```bash
# Specify the config file to use
mcp-chat --config my-config.json5

# Store local (stdio) MCP server logs in specific directory
mcp-chat --log-dir ./logs

# Enable verbose logging
mcp-chat --verbose

# Show help
mcp-chat --help
```

## Supported Model/API Providers

- **OpenAI**: `gpt-4o`, `gpt-4o-mini`, etc.
- **Anthropic**: `claude-sonnet-4-0`, `claude-3-5-haiku-latest`, etc.
- **Google (GenAI)**: `gemini-2.0-flash`, `gemini-1.5-pro`, etc.
- **xAI**: `grok-3-mini`, `grok-4`, etc.
- **Cerebras**: `gpt-oss-120b`, etc.
- **Groq**: `openai/gpt-oss-20b`, `openai/gpt-oss-120b`, etc.
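
A `provider`/`model` pair like those above maps naturally onto LangChain's provider-agnostic model initialization. As a sketch, assuming LangChain's `init_chat_model` helper (this README does not say which API the tool calls internally):

```python
from langchain.chat_models import init_chat_model

# Each supported provider pairs a LangChain integration with a model name.
llm = init_chat_model("gpt-4o-mini", model_provider="openai")
# llm = init_chat_model("claude-3-5-haiku-latest", model_provider="anthropic")
# llm = init_chat_model("gemini-2.0-flash", model_provider="google_genai")

print(llm.invoke("Say hello in five words.").content)
```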

## Configuration

Create an `llm_mcp_config.json5` file:

- [The configuration file format](https://github.com/hideya/mcp-client-langchain-py/blob/main/llm_mcp_config.json5)
  for MCP servers follows the same structure as
  [Claude for Desktop](https://modelcontextprotocol.io/quickstart/user),
  with one difference: the key name `mcpServers` has been changed
  to `mcp_servers` to follow the snake_case convention
  commonly used in JSON configuration files.
- The file format is [JSON5](https://json5.org/),
  where comments and trailing commas are allowed.
- The format is further extended to replace `${...}` notations
  with the values of corresponding environment variables.
- Keep all the credentials and private info in the `.env` file
  and refer to them with `${...}` notation as needed

```json5
{
  "llm": {
    "provider": "openai",
    "model": "gpt-4.1-nano",
    // "model": "gpt-5-mini",
  },

  // "llm": {
  //   "provider": "anthropic",
  //   "model": "claude-3-5-haiku-latest",
  //   // "model": "claude-sonnet-4-0",
  // },

  // "llm": {
  //   "provider": "google_genai",
  //   "model": "gemini-2.5-flash",
  //   // "model": "gemini-2.5-pro",
  // },

  // "llm": {
  //   "provider": "xai",
  //   "model": "grok-3-mini",
  //   // "model": "grok-4",
  // },

  // "llm": {
  //   "provider": "cerebras",
  //   "model": "gpt-oss-120b",
  // },

  // "llm": {
  //   "provider": "groq",
  //   "model": "openai/gpt-oss-20b",
  //   // "model": "openai/gpt-oss-120b",
  // },

  "example_queries": [
    "Tell me how LLMs work in a few sentences",
    "Are there any weather alerts in California?",
    "Read the news headlines on bbc.com",
  ],

  "mcp_servers": {
    // Local MCP server that uses `npx`
    "weather": {
      "command": "npx", 
      "args": [ "-y", "@h1deya/mcp-server-weather" ]
    },

    // Another local server that uses `uvx`
    "fetch": {
      "command": "uvx",
      "args": [ "mcp-server-fetch" ]
    },

    "brave-search": {
      "command": "npx",
      "args": [ "-y", "@modelcontextprotocol/server-brave-search" ],
      "env": { "BRAVE_API_KEY": "${BRAVE_API_KEY}" }
    },

    // Remote MCP server via URL
    // Auto-detection: tries Streamable HTTP first, falls back to SSE
    "remote-mcp-server": {
      "url": "https://api.example.com/..."
    },

    // Server with authentication
    "github": {
      "type": "http",  // recommended to specify the protocol explicitly when authentication is used
      "url": "https://api.githubcopilot.com/mcp/",
      "headers": {
        "Authorization": "Bearer ${GITHUB_PERSONAL_ACCESS_TOKEN}"
      }
    }
  }
}
```
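
The `${...}` expansion can be pictured as a substitution pass over the parsed config. A minimal sketch of the idea (not this tool's implementation; leaving unresolved variables as-is here is an assumption):

```python
import os
import re

_ENV_REF = re.compile(r"\$\{(\w+)\}")


def expand_env(value):
    """Recursively replace ${VAR} in strings with os.environ['VAR']."""
    if isinstance(value, str):
        return _ENV_REF.sub(lambda m: os.environ.get(m.group(1), m.group(0)), value)
    if isinstance(value, dict):
        return {k: expand_env(v) for k, v in value.items()}
    if isinstance(value, list):
        return [expand_env(v) for v in value]
    return value


os.environ["BRAVE_API_KEY"] = "BSA-example"
print(expand_env({"env": {"BRAVE_API_KEY": "${BRAVE_API_KEY}"}}))
# -> {'env': {'BRAVE_API_KEY': 'BSA-example'}}
```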

### Environment Variables

Create a `.env` file for API keys:

```bash
OPENAI_API_KEY=sk-proj-...
ANTHROPIC_API_KEY=sk-ant-...
GOOGLE_API_KEY=AI...
XAI_API_KEY=xai-...
CEREBRAS_API_KEY=csk-...
GROQ_API_KEY=gsk_...

# Other services as needed
GITHUB_PERSONAL_ACCESS_TOKEN=github_pat_...
BRAVE_API_KEY=BSA...
```

## Popular MCP Servers to Try

There are quite a few useful MCP servers already available:

- [MCP Server Listing on the Official Site](https://github.com/modelcontextprotocol/servers?tab=readme-ov-file#model-context-protocol-servers)

## Troubleshooting

- Make sure your configuration and `.env` files are correct, especially the spelling of the API key names
- Check the local MCP server logs
- Use the `--verbose` flag to view detailed logs
- Refer to [Debugging Section in MCP documentation](https://modelcontextprotocol.io/docs/tools/debugging)

## Change Log

Can be found [here](https://github.com/hideya/mcp-client-langchain-py/blob/main/CHANGELOG.md).

## License

MIT License - see [LICENSE](https://github.com/hideya/mcp-client-langchain-py/blob/main/LICENSE) file for details.

## Contributing

Issues and pull requests welcome! This tool aims to make MCP server testing as simple as possible.

            
