kel-cli


Name: kel-cli
Version: 0.0.13
Summary: AI assistant in your CLI.
Upload time: 2024-01-07 17:14:11
Author: NaveenKumar Namachivayam
Requires Python: >=3.10,<4.0
License: MIT
Keywords: gpt, cli, llm, openai, anthropic, kel, google, ollama, ai, artificial intelligence, assistant, chatbot, chat
            # 💬 Kel

[![Install with pip](https://img.shields.io/badge/Install_with-pip-blue)](https://pypi.org/project/kel-cli)
![PyPI - Version](https://img.shields.io/pypi/v/kel-cli)

Kel is your AI assistant in your CLI. 

> Kel `கேள்` means `ask` in Tamil.

## 🎥 Demo

![Kel-Demo](https://raw.githubusercontent.com/QAInsights/kel-docs/main/static/img/kel-demo.gif)

## ✅ Features

- Free and Open Source
- Bring your own API keys
- Supports multiple Large Language Models (LLMs) such as GPT-4, Claude, Llama 2 (via Ollama), and Google models
- Supports OpenAI assistants to chat with your documents
- Customizable

## 🧩 Installation

### Pre-requisites
- Python 3.10 or higher (see the quick check below)
- pip3
- API keys for OpenAI and any other LLM providers you plan to use
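
Before running the installation steps, you can quickly confirm the Python and pip prerequisites; this is only a sanity check using standard commands.

```bash
# Verify that Python 3.10+ and pip3 are available
python3 --version
pip3 --version
```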

### ⛳ Steps

```bash
python3 -m pip install --user pipx
python3 -m pipx ensurepath
pipx install kel-cli

# download the default config file and place it under ~/.kel
curl -O https://raw.githubusercontent.com/QAInsights/kel/main/config.toml
mkdir -p ~/.kel
mv config.toml ~/.kel/config.toml
```
Open the config file and customize the settings to your needs.

Then set the API keys for your LLM providers as environment variables in your OS, for example as shown below.
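
A minimal sketch of exporting the keys in a POSIX shell follows. The variable names (`OPENAI_API_KEY`, `ANTHROPIC_API_KEY`) are the conventional names used by the OpenAI and Anthropic SDKs and are an assumption here; check the config file for the exact names Kel expects.

```bash
# Assumption: conventional provider variable names; confirm the exact names
# Kel expects in ~/.kel/config.toml.
export OPENAI_API_KEY="sk-..."          # OpenAI key (placeholder)
export ANTHROPIC_API_KEY="sk-ant-..."   # Anthropic (Claude) key (placeholder)

# To persist them across sessions, add the export lines to ~/.bashrc or ~/.zshrc.
```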

## Usage

```bash
kel -v
```

```bash
kel -h
```

```bash
kel "git command to rebase"
```

```bash
kel "command to get active connections in linux"
```

```bash
kel "What was the population of India in 1990?"

> I'm sorry, I can only assist with questions related to software engineering and command line tools. 
I am unable to provide information on the population of India in 1990.
```

Now change the prompt and ask the same question.
```bash
kel "What was the population of India in 1990?" -p "You are a demography expert" 

> The population of India in 1990 was around 874 million people.
```

Now change the LLM and ask the same question.
```bash
kel "What was the population of India in 1990?" -p "You are a demography expert" -c ollama -m llama2 
```

To view the configuration details for a provider (for example, OpenAI), run the following command.
```bash
kel -s openai
```

> [!IMPORTANT]  
> LLM pricing varies based on usage. Check the provider's pricing before use.  
> LLMs can make mistakes. Review their answers before acting on them.  


## 🧰 Configuration

Kel can be configured using a [config file](./config.toml). It is a TOML file and supports a wide range of options.

Kel reads its configuration from `~/.kel/config.toml` or `~/.config/kel/config.toml`, or from the path set in the `KEL_CONFIG_FILE` environment variable.
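
For example, to point Kel at a config file in a non-default location, set the environment variable before running it; the path below is only illustrative.

```bash
# Use a config file from a custom location (illustrative path)
export KEL_CONFIG_FILE="$HOME/dotfiles/kel/config.toml"
kel -s openai
```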

## ⚙️ Defaults

- Uses OpenAI's `gpt-3.5-turbo-1106` model
- Displays stats
- Uses a default prompt focused on developers
- Copies the answer to the clipboard
- and more...

## 💰 Support

If you like this project, please consider supporting it:

- Buy me a coffee: https://www.buymeacoffee.com/qainsights





            
