chat-cli-anything


Name: chat-cli-anything
Version: 0.1.8
Summary: Chat with anything on cli.
Author: liuping
License: Apache
Requires Python: <4.0,>=3.9
Keywords: chatgpt, cli, chat
Upload time: 2024-04-23 16:50:37
![chat-cli-anything](./logo.png)

# Chat-Cli-Anything

Interact with GPT-like services from the command line, with support for local RAG and pipe input.

Why:
① Developers spend much of their time in the terminal, and when a problem comes up there, they would rather not have to switch to a search engine or another tool.
② By leveraging a language model's strong text-generation and text-processing abilities, the tool goes beyond simple search to offer code generation, explanation, and translation, as well as document Q&A.

## Tutorial

1. Installation

```shell
pip install "chat-cli-anything"
```

To simplify command length, you can set command aliases (optional):

```shell
# put this in your ~/.bashrc or ~/.zshrc
alias config="cc-config"
alias code="cc-code"
alias chat="cc-chat"
alias db="cc-db"
alias service="cc-service"
```

2. Add LLM provider

```
cc-config add "openai" "https://api.openai.com/v1"  --api-key "your-api-key"
```

If command aliases are set:

```
config add "openai" "https://api.openai.com/v1"  --api-key "your-api-key"
```
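To verify that the provider was registered and is reachable, you can use the `ping` and `list` subcommands documented below:

```
cc-config ping "openai"
cc-config list
```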

3. Start Using

**Example 1** Normal question

```
cc-ask "Who is the author of the science fiction novel 'Three Body'?"
```

```
cc-ask "How to get the memory size occupied by a process in Linux?"
```

**Example 2** Using a pipe

```
# Find the filename with the maximum size in the current directory
ls -lah | cc-ask "filename with max size"
```

```
# For debugging a compilation error
gcc -o a.out a.cpp 2>&1 | cc-ask "what is the error" -s
```
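Any command output can be piped in the same way. For instance, to look over recent system logs (both the command and the query here are only illustrative):

```
dmesg | tail -n 50 | cc-ask "is there anything abnormal in these logs?"
```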

**Example 3** Interactive questioning

```
cc-chat "What is the capital of France?"
```

**Example 4** Local document Q&A

```
cc-ask "Which school did the candidate graduate from?" -f resume.pdf
```
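To see which parts of the document were retrieved to produce the answer, add the `-s/--show-chunks` flag (a sketch; `resume.pdf` is just a placeholder file):

```
cc-ask "Which school did the candidate graduate from?" -f resume.pdf -s
```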

**Example 5** Local document collection

Step 1: Ingest documents to build a local database

```
# Ingest the detectron2-light source code into a collection named d2light
cc-db ingest "/path/to/detectron2-light" -n d2light
```

Step 2: Ask a question

```
# For open-ended questions, the '-a/--advance' option can be helpful
cc-ask -d d2light "DETR implementation" -a
```
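To check what the retriever will actually return for a query, you can also search the collection directly with `cc-db search` (documented below); the query here is illustrative:

```
cc-db search d2light "DETR implementation" -k 3
```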

**Example 6** Code explanation

```
# Explain line by line
cc-code explain /path/to/your/code.py -l

# If the code is lengthy, add the -c parameter to continue generating
cc-code explain /path/to/your/code.py -l -c
```

### Notice
This package depends on `torch` 2.2, so installing it may override the `torch` version already in your environment.

## Command Explanation

### cc-config

Configure LLM provider

#### 1. Add LLM provider

```
cc-config add [OPTIONS] NAME BASE_URL

Options:
  --model TEXT
  --api-key TEXT
  --proxy TEXT
  --help          Show this message and exit.
```

Example:

```
cc-config add "openai" "https://api.openai.com/v1"  --model "gpt-3.5-turbo-1106" --api-key "sk-xxxxxxxxxxx"
```
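If the provider is only reachable through a proxy, the `--proxy` option can be added; the configuration name, model, key, and proxy address below are all placeholders:

```
cc-config add "openai-proxy" "https://api.openai.com/v1" --model "gpt-3.5-turbo-1106" --api-key "sk-xxxxxxxxxxx" --proxy "http://127.0.0.1:7890"
```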

#### 2. Test provider

```
cc-config ping [OPTIONS] NAME

Ping provider.

Options:
  --help  Show this message and exit.
```

#### 3. List all providers

```
cc-config list [OPTIONS]

List all configurations.

Options:
  -s, --show-api-key
  --help              Show this message and exit.
```

The API key is hidden by default; use `-s/--show-api-key` to display it.

#### 4. Remove provider

```
cc-config remove [OPTIONS] [NAME]

Remove a configuration.

Options:
  --help  Show this message and exit.
```

#### 5. Switch active provider

```
cc-config switch [OPTIONS] NAME

Switch to a different configuration and save the change.

Options:
  --help  Show this message and exit.
```
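Example (assuming a provider named `openai` was added earlier):

```
cc-config switch "openai"
```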

#### 6. Load configuration file

```
Import configurations.

Options:
  -o, --override  Whether to override the original config.
  --help          Show this message and exit.
```

#### 7. Export configuration file

```
cc-config dump [OPTIONS] PATH

Export current configurations.

Options:
  -o, --override  Whether to override the existing file.
  --help          Show this message and exit.
```
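Example (the output path and extension are arbitrary placeholders; `-o` overwrites the file if it already exists):

```
cc-config dump ~/cc-config-backup.json -o
```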

### cc-ask

Perform a question-answering task

```
cc-ask [OPTIONS] QUERY

Start a chat session with the given query.

Options:
  -d, --db TEXT        Name of database.
  -f, --filename TEXT  Name of file.
  -r, --rerank         Whether to rerank the results.
  -s, --show-chunks    Whether to show the related chunks retrieved from
                       the database.
  -a, --advance        Whether to use advanced RAG.
  --help               Show this message and exit.
```
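For example, a retrieval-augmented query against the `d2light` collection from the tutorial, with reranking enabled and the retrieved chunks shown (the query itself is illustrative):

```
cc-ask -d d2light -r -s "where is the DETR loss implemented?"
```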

### cc-chat

Interactive Q&A

```
cc-chat [OPTIONS] [QUERY]

Interactive chat. Enter '/quit' to exit.

Options:
  -d, --db TEXT          Name of the database.
  -f, --filename TEXT    Name of the file.
  -n, --not-interactive
  -s, --show-history
  -c, --clear
  --help                 Show this message and exit.
```
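For example (a sketch; `design_notes.md` is a placeholder, and this assumes the history and clear flags can be used on their own since QUERY is optional):

```
# chat about a local file
cc-chat -f design_notes.md
# show the accumulated chat history
cc-chat -s
# clear the history
cc-chat -c
```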

### cc-code

Common code-related commands built on top of cc-ask.

#### Code Explanation

```
cc-code explain [OPTIONS] [FILENAME]

Explain code.

Options:
  -o, --object-name TEXT
  -l, --line               Line by line.
  -c, --continue-generate
  --help                   Show this message and exit.
```

Example 1: Specify filename

```
cc-code explain some_complex_scripts.py
```

Example 2: Specify a particular function

```
cc-code explain some_complex_scripts.py -o SomeClass::some_function
```

Example 3: Use pipe

```
cat some_complex_scripts.py | cc-code explain
```

#### Fix Issues

```
cc-code fix [OPTIONS] [FILENAME]

Fix code.

Options:
  -o, --object-name TEXT
  -c, --continue-generate
  --help                   Show this message and exit.
```
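Example (the filename and function name are placeholders):

```
cc-code fix buggy_script.py -o process_data
```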

#### Code Refactoring

```
cc-code refactor [OPTIONS] [FILENAME]

Refactor code.

Options:
  -o, --object-name TEXT
  -c, --continue-generate
  --help                   Show this message and exit.
```

#### Code Review

```
cc-code review [OPTIONS] [FILENAME]

Review code.

Options:
  -o, --object-name TEXT
  -c, --continue-generate
  --help                   Show this message and exit.
```
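Since FILENAME is optional, piping content in should also work, analogous to the `cc-code explain` pipe example above; this usage is an assumption rather than something documented here:

```
git diff HEAD~1 | cc-code review
```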

#### Code Translation

```
cc-code translate [OPTIONS] LANGUAGE [FILENAME]

Translate code from one language to another. Supported languages include c++, cpp,
c, rust, typescript, javascript, markdown, html.

Options:
  -o, --object-name TEXT
  -c, --continue-generate
  --help                   Show this message and exit.
```
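Example, translating a Python script into Rust (one of the listed target languages; the filename is a placeholder):

```
cc-code translate rust utils.py
```

The resulting snippet can then be extracted with `cc-code select`, described below.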

#### Select Code from Generated Results

Used to select code snippets from the responses generated by the commands above.

```
cc-code select [OPTIONS] [INDEX]

Select code snippet from the last output.

Argument:
  INDEX  code snippet index

Options:
  -c, --count  get the number of code snippets
  --help       Show this message and exit.
```

`-c/--count` returns the number of candidate code blocks in the last response.

Example:

```
>>> cc-code select -c
2

# Select the second code block from the last output
# For macOS
>>> cc-code select 1 | pbcopy
# For Linux
>>> cc-code select 1 | xclip -selection clipboard
```

### cc-db

#### list: List all text collections

```
cc-db list [OPTIONS] [NAME]

List all document databases.

Options:
  -s, --short  List in short format.
  --help       Show this message and exit.
```

#### ingest: Ingest documents

```
cc-db ingest [OPTIONS] [FILES]...

Read documents and convert them into a searchable database.

Options:
  -n, --name TEXT     The name of the knowledge base.
  -m, --comment TEXT  Add comment to info.
  --help              Show this message and exit.
```
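Multiple files can be ingested into a single collection, optionally with a comment (the filenames and collection name are placeholders):

```
cc-db ingest report_2023.pdf report_2024.pdf -n reports -m "annual reports"
```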

#### remove: Remove document collection

```
cc-db remove [OPTIONS] NAME

Remove database with the given name.

Options:
  -d, --remove-documents  Also remove the stored document data.
  --help                  Show this message and exit.
```
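Example, removing the `d2light` collection from the tutorial together with its stored document data:

```
cc-db remove d2light -d
```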

#### search: Search within a document collection

```
cc-db search [OPTIONS] DB QUERY

Options:
  -k, --topk INTEGER  Number of top results to return.
  --help              Show this message and exit.
```

### cc-service

Used to manage local text-to-index services.

If you want to run a text-to-index service on your local machine, install the extra dependencies with `pip install "chat-cli-anything[all]"`.

#### start

```
cc-service start [OPTIONS]

Options:
  --help  Show this message and exit.
```

Starting the service may take some time (about one minute) to load the model; you can check whether it has started with `cc-service status`.
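A typical sequence, assuming the `[all]` extra has been installed:

```
cc-service start
cc-service status
# ... when finished
cc-service stop
```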

#### stop

```
cc-service stop [OPTIONS]

Options:
  --help  Show this message and exit.
```

#### status

```
cc-service status [OPTIONS]

Options:
  --help  Show this message and exit.
```

            
