gpt4all-cli
===========

Name: gpt4all-cli
Version: 1.3.0
Summary: Command-line interface using GPT4ALL bindings
Home page: https://github.com/sergei-mironov/gpt4all-cli
Author: Sergei Mironov
Requires Python: >=3.6
License: not specified
Requirements: none recorded
Uploaded: 2024-07-31 09:15:10

A simple GNU Readline-based application for interaction with chat-oriented AI models using
[GPT4All](https://www.nomic.ai/gpt4all) Python bindings.

Contents
--------

<!-- vim-markdown-toc GFM -->

* [Install](#install)
    * [Pip](#pip)
    * [Nix](#nix)
* [Usage](#usage)
    * [Example session](#example-session)
* [Vim integration](#vim-integration)

<!-- vim-markdown-toc -->

Install
-------

The following installation options are available:

### Pip

```sh
$ pip install git+https://github.com/sergei-mironov/gpt4all-cli.git
```

Note: `pip install gpt4all-cli` might also work, but installing from the `git+https` URL brings in
the most recent version.

### Nix

```sh
$ git clone --depth=1 https://github.com/sergei-mironov/gpt4all-cli && cd gpt4all-cli
# Optionally, change the 'nixpkgs' input of flake.nix to a more suitable revision
$ nix profile install ".#python-gpt4all-cli"
```

Usage
-----

<!--
``` python
!gpt4all-cli --help
```
-->
``` result
usage: gpt4all-cli [-h] [--model-dir MODEL_DIR] [--model [STR1:]STR2]
                   [--num-threads NUM_THREADS] [--model-apikey STR]
                   [--model-temperature MODEL_TEMPERATURE] [--device DEVICE]
                   [--readline-key-send READLINE_KEY_SEND]
                   [--readline-prompt READLINE_PROMPT]
                   [--readline-history FILE] [--verbose NUM] [--revision]

Command-line arguments

options:
  -h, --help            show this help message and exit
  --model-dir MODEL_DIR
                        Model directory to prepend to model file names
  --model [STR1:]STR2, -m [STR1:]STR2
                        Model to use. STR1 is 'gpt4all' (the default) or
                        'openai'. STR2 is the model name
  --num-threads NUM_THREADS, -t NUM_THREADS
                        Number of threads to use
  --model-apikey STR    Model provider-specific API key
  --model-temperature MODEL_TEMPERATURE
                        Temperature parameter of the model
  --device DEVICE, -d DEVICE
                        Device to use for chatbot, e.g. gpu, amd, nvidia,
                        intel. Defaults to CPU
  --readline-key-send READLINE_KEY_SEND
                        Terminal code to treat as Ctrl+Enter (default: \C-k)
  --readline-prompt READLINE_PROMPT
                        Input prompt (default: >>>)
  --readline-history FILE
                        History file name (default is '_gpt4all_cli_history';
                        set empty to disable)
  --verbose NUM         Set the verbosity level 0-no,1-full
  --revision            Print the revision
```
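
For instance, a typical invocation points the program at a model directory and picks a model plus
generation settings. The command below is only a sketch: the thread count and temperature values
are arbitrary, the model path mirrors the example session further down, and the OpenAI model name
and API key are placeholders.

``` sh
$ gpt4all-cli \
    --model-dir ~/.local/share/nomic.ai/GPT4All \
    --model "Meta-Llama-3-8B-Instruct.Q4_0.gguf" \
    --num-threads 4 \
    --model-temperature 0.7

# Select an OpenAI-hosted model instead (placeholder model name and key):
$ gpt4all-cli -m openai:MODEL_NAME --model-apikey "YOUR_API_KEY"
```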

The console accepts a language defined by the following grammar:

<!--
``` python
from gpt4all_cli import GRAMMAR
from textwrap import dedent
print(dedent(GRAMMAR).strip())
```
-->

``` result
start: (command | escape | text)? (command | escape | text)*
escape.3: /\\./
command.2: /\/ask|\/exit|\/help|\/reset/ | \
           /\/model/ / +/ string | \
           /\/apikey/ / +/ string | \
           /\/nthreads/ / +/ (number | def) | \
           /\/verbose/ / +/ (number | def) | \
           /\/temp/ / +/ (float | def ) | \
           /\/echo/ | /\/echo/ / /
string: /"[^\"]+"/ | /""/
number: /[0-9]+/
float: /[0-9]+\.[0-9]*/
def: "default"
text: /(.(?!\/|\\))*./s
```
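
Reading the grammar: a line of input is plain text up to the first `/` or `\`, commands start with
`/`, and `\` escapes the character that follows it. The snippet below is illustrative only and is
inferred from the grammar rather than copied from a real session; in particular, writing a literal
slash inside the text as `\/` is an assumption based on the `escape` and `text` rules.

``` txt
>>> /temp 0.7
>>> /nthreads default
>>> /verbose 1
>>> What does the \/etc\/fstab file do?
>>> /ask
```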

### Example session

``` sh
$ gpt4all-cli
```
``` txt
Type /help or a question followed by the /ask command (or by pressing `C-k` key).
>>> /model "~/.local/share/nomic.ai/GPT4All/Meta-Llama-3-8B-Instruct.Q4_0.gguf"
>>> Hi!
>>> /ask
Hello! I'm happy to help you. What's on your mind?^C
>>> What's your name?
>>> /ask
I don't really have a personal name, but you can call me "Assistant"
```
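
Under the hood, `gpt4all:`-type models go through the GPT4All Python bindings mentioned at the top
of this README. Roughly, the session above corresponds to the following Python sketch; it is not
part of gpt4all-cli itself, and the model name, directory, and token limit are just illustrative
values.

``` python
from os.path import expanduser
from gpt4all import GPT4All

# Open a local GGUF model stored in the GPT4All model directory.
model = GPT4All("Meta-Llama-3-8B-Instruct.Q4_0.gguf",
                model_path=expanduser("~/.local/share/nomic.ai/GPT4All"))

# A chat session keeps the conversation history between prompts.
with model.chat_session():
    print(model.generate("Hi!", max_tokens=200))
    print(model.generate("What's your name?", max_tokens=200))
```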

Vim integration
---------------

Gpt4all-cli is supported by the [Litrepl](https://github.com/sergei-mironov/litrepl) text processor.

![Peek 2024-07-19 00-11](https://github.com/user-attachments/assets/7e5e59ea-bb96-4ebe-988f-726e83929dab)

            
