A (yet another) GNU Readline-based application for interaction with chat-oriented AI models.
Supported model providers:
* [OpenAI](https://www.openai.com) via REST API
* [GPT4All](https://www.nomic.ai/gpt4all) via Python bindings
Contents
--------
<!-- vim-markdown-toc GFM -->
* [Install](#install)
    * [Pip](#pip)
    * [Nix](#nix)
* [Usage](#usage)
    * [Example session](#example-session)
* [Vim integration](#vim-integration)
<!-- vim-markdown-toc -->
Install
-------
The following installation options are available:
### Pip
```sh
$ pip install sm_aicli
```
### Nix
```sh
$ git clone --depth=1 https://github.com/sergei-mironov/aicli && cd aicli
# Optionally, change the 'nixpkgs' input of flake.nix to a more suitable version
$ nix profile install ".#python-aicli"
```
Usage
-----
<!--
``` python
!aicli --help
```
-->
``` result
usage: aicli [-h] [--model-dir MODEL_DIR] [--model [STR1:]STR2]
             [--num-threads NUM_THREADS] [--model-apikey STR]
             [--model-temperature MODEL_TEMPERATURE] [--device DEVICE]
             [--readline-key-send READLINE_KEY_SEND]
             [--readline-prompt READLINE_PROMPT] [--readline-history FILE]
             [--verbose NUM] [--revision] [--version]

Command-line arguments

options:
  -h, --help            show this help message and exit
  --model-dir MODEL_DIR
                        Model directory to prepend to model file names
  --model [STR1:]STR2, -m [STR1:]STR2
                        Model to use. STR1 is 'gpt4all' (the default) or
                        'openai'. STR2 is the model name
  --num-threads NUM_THREADS, -t NUM_THREADS
                        Number of threads to use
  --model-apikey STR    Model provider-specific API key
  --model-temperature MODEL_TEMPERATURE
                        Temperature parameter of the model
  --device DEVICE, -d DEVICE
                        Device to use for chatbot, e.g. gpu, amd, nvidia,
                        intel. Defaults to CPU
  --readline-key-send READLINE_KEY_SEND
                        Terminal code to treat as Ctrl+Enter (default: \C-k)
  --readline-prompt READLINE_PROMPT
                        Input prompt (default: >>>)
  --readline-history FILE
                        History file name (default is '_sm_aicli_history'; set
                        empty to disable)
  --verbose NUM         Set the verbosity level 0-no,1-full
  --revision            Print the revision
  --version             Print the version
```
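As the help text notes, the argument to `--model` follows a `[provider:]name` convention, where the provider defaults to `gpt4all`. A minimal sketch of how such a specifier could be split (a hypothetical helper for illustration, not part of aicli's code):

``` python
def parse_model_spec(spec: str, default_provider: str = "gpt4all"):
    """Split a '[STR1:]STR2' model specifier into (provider, model_name)."""
    provider, sep, name = spec.partition(":")
    if not sep:
        # No colon present: the whole string is the model name,
        # and the default provider applies.
        return default_provider, spec
    return provider, name

print(parse_model_spec("openai:gpt-4o"))  # -> ('openai', 'gpt-4o')
print(parse_model_spec("Meta-Llama-3-8B-Instruct.Q4_0.gguf"))
```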
The console accepts a language defined by the following grammar:
<!--
``` python
from sm_aicli import GRAMMAR
from textwrap import dedent
print(dedent(GRAMMAR).strip())
```
-->
``` result
start: (command | escape | text)? (command | escape | text)*
escape.3: /\\./
command.2: /\/ask|\/exit|\/help|\/reset/ | \
           /\/model/ / +/ (/"/ model_string /"/ | /"/ /"/) | \
           /\/apikey/ / +/ (/"/ apikey_string /"/ | /"/ /"/) | \
           /\/nthreads/ / +/ (number | def) | \
           /\/verbose/ / +/ (number | def) | \
           /\/temp/ / +/ (float | def ) | \
           /\/echo/ | /\/echo/ / /
model_string: (model_provider ":")? model_name
model_provider: "gpt4all" -> mp_gpt4all | "openai" -> mp_openai | "dummy" -> mp_dummy
model_name: /[^"]+/
apikey_string: (apikey_schema ":")? apikey_value
apikey_schema: "verbatim" -> as_verbatim | "file" -> as_file
apikey_value: /[^"]+/
number: /[0-9]+/
float: /[0-9]+\.[0-9]*/
def: "default"
text: /(.(?!\/|\\))*./s
```
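The grammar distinguishes three token classes: backslash escapes bind tightest, then `/`-prefixed commands, then plain text. A rough illustrative tokenizer for a small subset (only the argument-less commands) can mimic this priority with regex alternation; this is a sketch for intuition, not aicli's actual parser:

``` python
import re

# Alternation order encodes priority: escape > command > text,
# matching the .3 / .2 rule priorities in the grammar above.
TOKEN = re.compile(
    r"(?P<escape>\\.)"
    r"|(?P<command>/(?:ask|exit|help|reset))"
    r"|(?P<text>[^/\\]+)",
    re.S,
)

def tokenize(s):
    return [(m.lastgroup, m.group()) for m in TOKEN.finditer(s)]

print(tokenize("Hi!/ask"))  # -> [('text', 'Hi!'), ('command', '/ask')]
```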
### Example session
``` sh
$ aicli
```
``` txt
Type /help or a question followed by the /ask command (or by pressing `C-k` key).
>>> /model "~/.local/share/nomic.ai/GPT4All/Meta-Llama-3-8B-Instruct.Q4_0.gguf"
>>> Hi!
>>> /ask
Hello! I'm happy to help you. What's on your mind?^C
>>> What's your name?
>>> /ask
I don't really have a personal name, but you can call me "Assistant"
```
Vim integration
---------------
Aicli is supported by the [Litrepl](https://github.com/sergei-mironov/litrepl) text processor.
