| Field | Value |
| --- | --- |
| Name | fire-chat |
| Version | 0.0.6 |
| Summary | A CLI tool to chat with LLM models including GPT and Claude. |
| Upload time | 2024-10-10 22:20:03 |
| Home page | None |
| Maintainer | None |
| Docs URL | None |
| Author | None |
| Requires Python | >=3.10 |
| License | None |
| Keywords | llm, anthropic, chatgpt, claude, cli, openai |
| Requirements | No requirements were recorded. |
| Travis-CI | No Travis. |
| Coveralls test coverage | No coveralls. |
# fire-chat
## Overview
This project provides a command-line interface (CLI) for interacting with various large language models (LLMs) using the
LiteLLM wrapper. It supports multiple providers, including OpenAI, Anthropic, Azure, and Gemini. The CLI allows users to
chat with these models, manage budgets, and handle API keys efficiently.
## Configuration
Configuration is managed through a `$HOME/.config/fire-chat/config.yaml` file. Before running the CLI for the first
time, copy the starter config file [config.yaml](examples/config.yaml) to that location, add your API key,
and start the application with `fire-chat`.
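The config file holds your provider, model, and API-key settings. A starter file might look roughly like the sketch below; the key names here are illustrative assumptions, and the authoritative template is [config.yaml](examples/config.yaml):
```yaml
# Illustrative sketch only -- copy examples/config.yaml for the real key names.
provider: openai   # one of the supported providers (openai, anthropic, azure, gemini)
model: gpt-4o      # default model; can be overridden on the command line
api_key: sk-...    # your provider API key
```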
## Installation and Usage
1. **Install the CLI**:
```shell
pip install --user fire-chat # requires python 3.10+
```
2. **Configure the CLI**:
Edit the `$HOME/.config/fire-chat/config.yaml` file to set your preferred provider, model, and other settings.
3. **Run the CLI**:
```shell
fire-chat
```
or run with arguments to override the config YAML file:
```shell
fire-chat --model=gpt-4o
```
For the full list of configuration options, see [main.py](src/fire_chat/main.py).
4. **Exit**:
Press `Ctrl+C` to exit the CLI.
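As step 3 notes, command-line flags take precedence over values from the YAML file. That merge logic can be sketched in plain Python; this is an illustrative approximation, not fire-chat's actual implementation, which lives in [main.py](src/fire_chat/main.py):

```python
# Sketch of "CLI flags override the YAML config" precedence (illustrative only).
import argparse

def effective_config(file_config: dict, argv: list[str]) -> dict:
    """Merge YAML-file settings with CLI flags; flags win when both are set."""
    parser = argparse.ArgumentParser(prog="fire-chat")
    parser.add_argument("--model")
    parser.add_argument("--provider")
    args = parser.parse_args(argv)
    # Keep only the flags the user actually passed.
    overrides = {k: v for k, v in vars(args).items() if v is not None}
    return {**file_config, **overrides}

print(effective_config({"model": "gpt-4o-mini", "provider": "openai"},
                       ["--model=gpt-4o"]))
# {'model': 'gpt-4o', 'provider': 'openai'}
```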
## Raw data

```json
{
    "_id": null,
    "home_page": null,
    "name": "fire-chat",
    "maintainer": null,
    "docs_url": null,
    "requires_python": ">=3.10",
    "maintainer_email": "Tiansu Yu <tiansu.yu@icloud.com>, Tadeas Fort <business@tadeasfort.com>",
    "keywords": "LLM, anthropic, chatGPT, claude, cli, openai",
    "author": null,
    "author_email": "Tiansu Yu <tiansu.yu@icloud.com>, Marco Lardera <larderamarco@hotmail.com>, Tadeas Fort <business@tadeasfort.com>",
    "download_url": "https://files.pythonhosted.org/packages/ab/19/96d62e668137728f62ddf0bd4da2efa6a3e4770a86c0f75b88c34278eff9/fire_chat-0.0.6.tar.gz",
    "platform": null,
    "description": "# fire-chat\n\n## Overview\n\nThis project provides a command-line interface (CLI) for interacting with various large language models (LLMs) using the\nLiteLLM wrapper. It supports multiple providers, including OpenAI, Anthropic, Azure, and Gemini. The CLI allows users to\nchat with these models, manage budgets, and handle API keys efficiently.\n\n## Configuration\n\nThe configuration is managed through a `$HOME/.config/fire-chat/config.yaml` file. The first time you run the CLI run.\nYou can copy paste the starting config file [config.yaml](examples/config.yaml) to the location, adds your API key,\nand quick start the application `fire-chat`.\n\n## Installation and Usage\n\n1. **Install the CLI**:\n\n ```shell\n pip install --user fire-chat # requires python 3.10+\n ```\n\n2. **Configure the CLI**:\n\n Edit the `$HOME/.config/fire-chat/config.yaml` file to set your preferred provider, model, and other settings.\n\n3. **Run the CLI**:\n\n ```shell\n fire-chat\n ```\n\n or run with arguments (overriding config yaml file)\n\n ```shell\n fire-chat --model=gpt-4o\n ```\n\n for full list of configs, see [main.py](src/fire_chat/main.py).\n\n4. **Exit**:\n To exit the CLI, `Ctrl+C`.\n",
    "bugtrack_url": null,
    "license": null,
    "summary": "A CLI tool to chat with LLM models including GPT and Claude.",
    "version": "0.0.6",
    "project_urls": {
        "Homepage": "https://github.com/TiansuYu/fire-chat",
        "Issues": "https://github.com/TiansuYu/fire-chat/issues",
        "Repository": "https://github.com/TiansuYu/fire-chat"
    },
    "split_keywords": [
        "llm",
        " anthropic",
        " chatgpt",
        " claude",
        " cli",
        " openai"
    ],
    "urls": [
        {
            "comment_text": null,
            "digests": {
                "blake2b_256": "8f2e16a001d6ce2eef706d5474c65389949b393ffe2104c593ee5e94f02db3af",
                "md5": "42aa2c9b9d3f91be785599fb913ca7b0",
                "sha256": "ff3f32a8d5bab44de8fddc4591e5f68ff0c2e1835a67f5719a33e2639b9e506e"
            },
            "downloads": -1,
            "filename": "fire_chat-0.0.6-py3-none-any.whl",
            "has_sig": false,
            "md5_digest": "42aa2c9b9d3f91be785599fb913ca7b0",
            "packagetype": "bdist_wheel",
            "python_version": "py3",
            "requires_python": ">=3.10",
            "size": 17611,
            "upload_time": "2024-10-10T22:20:02",
            "upload_time_iso_8601": "2024-10-10T22:20:02.164057Z",
            "url": "https://files.pythonhosted.org/packages/8f/2e/16a001d6ce2eef706d5474c65389949b393ffe2104c593ee5e94f02db3af/fire_chat-0.0.6-py3-none-any.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": null,
            "digests": {
                "blake2b_256": "ab1996d62e668137728f62ddf0bd4da2efa6a3e4770a86c0f75b88c34278eff9",
                "md5": "ff4f39cdd2573fc479e720d3a0678c67",
                "sha256": "62d8911d495bf26a7a8e26deb363a99e64dcf72486d8354ee1f4b8da710f89cb"
            },
            "downloads": -1,
            "filename": "fire_chat-0.0.6.tar.gz",
            "has_sig": false,
            "md5_digest": "ff4f39cdd2573fc479e720d3a0678c67",
            "packagetype": "sdist",
            "python_version": "source",
            "requires_python": ">=3.10",
            "size": 86093,
            "upload_time": "2024-10-10T22:20:03",
            "upload_time_iso_8601": "2024-10-10T22:20:03.802950Z",
            "url": "https://files.pythonhosted.org/packages/ab/19/96d62e668137728f62ddf0bd4da2efa6a3e4770a86c0f75b88c34278eff9/fire_chat-0.0.6.tar.gz",
            "yanked": false,
            "yanked_reason": null
        }
    ],
    "upload_time": "2024-10-10 22:20:03",
    "github": true,
    "gitlab": false,
    "bitbucket": false,
    "codeberg": false,
    "github_user": "TiansuYu",
    "github_project": "fire-chat",
    "travis_ci": false,
    "coveralls": false,
    "github_actions": true,
    "lcname": "fire-chat"
}
```