# ai.py

A single-file Python script that interacts with the ChatGPT API from the command line.

![](images/screenshot-repl.png)

Features:
- Use shortcuts to access predefined prompts
- Highlight code in output
- Support one-shot queries and conversations
- Use special commands like `!set` to control the behavior when chatting


## Install

Just copy the script to a folder in `$PATH`, like `/usr/local/bin`. You can also rename it to `ai` to get rid of the `.py` extension.

Here's a command that can directly install the script into your system:

```
curl https://raw.githubusercontent.com/reorx/ai.py/master/ai.py -o /usr/local/bin/ai && chmod +x /usr/local/bin/ai
```

You can also install it with pip or pipx:

```
pip install aidotpy
```
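If you prefer pipx (assuming you have it installed), the equivalent command should be:

```
pipx install aidotpy
```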

## Usage

Paste your OpenAI API key into `~/.ai_py_config.json`, or set it in the `AI_PY_API_KEY` environment variable.

```bash
echo '{"api_key":"<Your API key>"}' > ~/.ai_py_config.json
```
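Alternatively, you can export the `AI_PY_API_KEY` environment variable in your shell (put it in your shell profile to persist it), for example:

```bash
export AI_PY_API_KEY='<Your API key>'
```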

For detailed usage of the script, please read the output of `./ai.py -h`:

```
usage: ai [-h] [-s SYSTEM] [-c] [--history HISTORY] [-w] [-v] [-t] [-d]
          [--version]
          [PROMPT]

A simple CLI for ChatGPT API

positional arguments:
  PROMPT                your prompt, leave it empty to run REPL. you can use @
                        to load prompt from ~/.ai_py_prompts.json

options:
  -h, --help            show this help message and exit
  -s SYSTEM, --system SYSTEM
                        system message to use at the beginning of the
                        conversation. if starts with @, the message will be
                        located through ~/.ai_py_prompts.json
  -c, --conversation    enable conversation, which means all the messages will
                        be sent to the API, not just the last one. This is
                        only useful to REPL
  --history HISTORY     load the history from a JSON file.
  -w, --write-history   write new messages to --history file after each chat.
  -v, --verbose         verbose mode, show execution info and role in the
                        message
  -t, --show-tokens     show a breakdown of the tokens used in the prompt and
                        in the response
  -d, --debug           debug mode, enable logging
  --version             show program's version number and exit
```
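For example, to keep a persistent conversation across sessions, you could combine the `-c`, `--history`, and `-w` flags described above (`chat.json` here is just a hypothetical file name):

```
# chat.json is a hypothetical file name; any path works
./ai.py -c --history chat.json -w
```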

### One-off query

Pass the prompt as the first argument:

```
./ai.py 'hello world'
```

You can also pass the prompt through a pipe (`|`):

```
head README.md | ./ai.py 'Proofread the following text:'
```

### REPL

Run it without arguments to start a [Read–eval–print loop](https://en.wikipedia.org/wiki/Read%E2%80%93eval%E2%80%93print_loop) (REPL):

```
./ai.py
```

By default, only the last message and the system message are sent to the API.
If you want it to remember all the context (i.e. send all the messages in each chat),
add the `-c` argument to enable conversation mode:

```
./ai.py -c
```

### System message

You can pass a system message to define the behavior of the assistant:

```
./ai.py -s 'You are a proofreader' 'its nice know you'
```

You can also save predefined system messages in `~/.ai_py_prompts.json`
and refer to them with an `@` prefix; this is covered in the next section.


### Prompt shortcuts

You can predefine prompts in `~/.ai_py_prompts.json` and refer to them by using `@` as a prefix.
This works for both system messages and user messages.

Suppose your `~/.ai_py_prompts.json` looks like this:

```json
{
  "system": {
    "cli": "As a technology assistant with expertise in command line, answer questions in simple and short words for users who have a high-level background. Provide only one example, and explain as less as possible."
  },
  "user": {
    "native": "Paraphrase the following sentences to make it more native:\n",
    "revise": "Revise the following sentences to make them more clear concise and coherent:\n",
    "": ""
  }
}
```

Then you can use the `cli` prompt shortcut in the system message:

```
./ai.py -s @cli
```

and use the `native` or `revise` prompt shortcut in a user message:

```
./ai.py '@native its nice know you'

It's great to get to know you.
```
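Prompt shortcuts can also be combined with piped input, assuming the piped text is appended to the prompt as in the one-off example above:

```
head README.md | ./ai.py @revise
```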

### Verbose mode

Add `-v` to print the role name and the parameters used in the API call.

<details>
  <summary>Screenshot</summary>

  ![](images/screenshot-repl-verbose.png)
</details>

### Special commands

You can use special commands to control the behavior of the script when running in the REPL.

Here's a list of available commands (a sample session follows the list):

- `!set <key> <value>`: set a key-value pair in the config; available keys are:
  - `verbose`: set to `True` or `False`, e.g. `!set verbose True`
  - `conversation`: set to `True` or `False`, e.g. `!set conversation True`
  - `system`: set the system message, e.g. `!set system you are a poet`, `!set system @cli`
  - `params`: set the parameters for the ChatGPT API, e.g. `!set params temperature 0.5`
  - `model`: set the model to use, e.g. `!set model gpt-4`
- `!info`: print the execution info
- `!write-history`: write the current messages to a history file, e.g. `!write-history history.json`
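Below is a sketch of how these commands might look in a REPL session; the `>` prompt is only illustrative, and the model's responses are omitted:

```
> !set system @cli
> !set conversation True
> !set params temperature 0.5
> !set model gpt-4
> !info
> !write-history history.json
```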

            
