# chap 0.8.4 (PyPI metadata)

* Name: chap
* Version: 0.8.4
* Summary: Interact with the OpenAI ChatGPT API (and other text generators)
* Upload time: 2024-10-22 12:58:45
* Requires Python: >=3.9
* Keywords: llm, tui, chatgpt
            <!--
SPDX-FileCopyrightText: 2021 Jeff Epler

SPDX-License-Identifier: MIT
-->
[![Test](https://github.com/jepler/chap/actions/workflows/test.yml/badge.svg)](https://github.com/jepler/chap/actions/workflows/test.yml)
[![Release chap](https://github.com/jepler/chap/actions/workflows/release.yml/badge.svg?event=release)](https://github.com/jepler/chap/actions/workflows/release.yml)
[![PyPI](https://img.shields.io/pypi/v/chap)](https://pypi.org/project/chap/)

# chap - A Python interface to ChatGPT and other LLMs, including a terminal user interface (TUI)

![Chap screencast](https://raw.githubusercontent.com/jepler/chap/main/chap.gif)

## System requirements

Chap is primarily developed on Linux with Python 3.11. Moderate effort will be made to support versions back to Python 3.9 (Debian oldstable).

## Installation

If you want `chap` available as a command, install it with `pipx install chap` or `pip install chap`.

Use a virtual environment unless you want it installed globally.

## Installation for development

Use one of the following two methods to run `chap` as a command, with the ability to edit the source files. You are welcome to submit valuable changes as [a pull request](https://github.com/jepler/chap/pulls).

### Via `pip install --editable .`

This is an "editable install", as [recommended by the Python Packaging Authority](https://setuptools.pypa.io/en/latest/userguide/development_mode.html).

Change directory to the root of the `chap` project.

Activate your virtual environment, then install `chap` in development mode:

```shell
pip install --editable .
```

In this mode, you get the `chap` command-line program installed, but you are able to edit the source files in the `src` directory in place.

### Via `chap-dev.py`

A simple shim script called `chap-dev.py` is included to demonstrate how to load and run the `chap` library without installing `chap` in development mode. This method may be more familiar to some developers.

Change directory to the root of the `chap` project.

Activate your virtual environment, then install requirements:

```shell
pip install -r requirements.txt
```

Run the shim script (with optional command flags as appropriate):

```shell
./chap-dev.py
```

In this mode, you can edit the source files in the `src` directory in place, and the shim script will pick up the changes via the `import` directive.

## Contributing

See [CONTRIBUTING.md](CONTRIBUTING.md).

## Code of Conduct

See [CODE\_OF\_CONDUCT.md](CODE_OF_CONDUCT.md).

## Configuration

Put your OpenAI API key in the platform configuration directory for chap, e.g., on Linux/Unix systems at `~/.config/chap/openai_api_key`.
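As a sketch, the key file can be created programmatically like this (the path is the one given above for Linux/Unix; `sk-your-key-here` is a placeholder you must replace with a real key):

```python
from pathlib import Path

# Path chap reads on Linux/Unix systems (per the docs above)
key_path = Path.home() / ".config" / "chap" / "openai_api_key"
key_path.parent.mkdir(parents=True, exist_ok=True)
key_path.write_text("sk-your-key-here\n")  # replace with your real key
key_path.chmod(0o600)  # keep the key readable only by you
```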

## Command-line usage

 * `chap ask "What advice would you give a 20th century human visiting the 21st century for the first time?"`

 * `chap render --last` / `chap cat --last`

 * `chap import chatgpt-style-chatlog.json` (for files from pionxzh/chatgpt-exporter)

 * `chap grep needle`

## `@FILE` arguments

It's useful to set a bunch of related arguments together, for instance to fully
configure a back-end. This functionality is implemented via `@FILE` arguments.

Before any other command-line argument parsing is performed, `@FILE` arguments are expanded:

 * An `@FILE` argument is searched relative to the current directory
 * An `@:FILE` argument is searched relative to the configuration directory (e.g., $HOME/.config/chap)
 * If an argument starts with a literal `@`, double it: `@@`
 * `@.` stops processing any further `@FILE` arguments and leaves them unchanged.
The contents of an `@FILE` are parsed with `shlex.split(comments=True)`, so `#` comments are supported.
A typical content might look like this:
```
# gpt-3.5.txt: Use cheaper gpt 3.5 and custom prompt
--backend openai-chatgpt
-B model:gpt-3.5-turbo
-s my-custom-system-message.txt
```
and you might use it with
```
chap @:gpt-3.5.txt ask what version of gpt is this
```
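As an illustration of the parsing rule above, Python's standard `shlex` module splits the example file like this (this sketch demonstrates `shlex.split` itself, not chap's internal code):

```python
import shlex

# Content of the example @FILE above
content = """\
# gpt-3.5.txt: Use cheaper gpt 3.5 and custom prompt
--backend openai-chatgpt
-B model:gpt-3.5-turbo
-s my-custom-system-message.txt
"""

# comments=True makes shlex drop the '#' comment line
args = shlex.split(content, comments=True)
print(args)
# → ['--backend', 'openai-chatgpt', '-B', 'model:gpt-3.5-turbo',
#    '-s', 'my-custom-system-message.txt']
```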

## Interactive terminal usage
The interactive terminal mode is accessed via `chap tui`.

There are a variety of keyboard shortcuts to be aware of:
 * tab/shift-tab to move between the entry field and the conversation, or between conversation items
 * While in the text box, F9 or (if supported by your terminal) alt+enter to submit multiline text
 * while on a conversation item:
   * ctrl+x to re-draft the message. This
     * saves a copy of the session in an auto-named file in the conversations folder
     * removes the conversation from this message to the end
     * puts the user's message in the text box to edit
   * ctrl+x to re-submit the message. This
     * saves a copy of the session in an auto-named file in the conversations folder
     * removes the conversation from this message to the end
     * puts the user's message in the text box
     * and submits it immediately
   * ctrl+y to yank the message. This places the response part of the current
     interaction in the operating system clipboard to be pasted (e.g., with
     ctrl+v or command+v in other software)
   * ctrl+q to toggle whether this message may be included in the contextual history for a future query.
     The exact way history is submitted is determined by the back-end, often by
     counting messages or tokens, but the ctrl+q toggle ensures this message (both the user
     and assistant parts) is not considered.

## Sessions & Command-line Parameters

Details of session handling & command-line arguments are in flux.

By default, a new session is created. It is saved to the user's state directory
(e.g., `~/.local/state/chap` on linux/unix systems).

You can specify the session filename for a new session with `-n` or to re-open
an existing session with `-s`. Or, you can continue the last session with
`--last`.

You can set the "system message" with the `-S` flag.

You can select the text generating backend with the `-b` flag:
 * openai-chatgpt: the default, paid API, best quality results
 * llama-cpp: Works with [llama.cpp's http server](https://github.com/ggerganov/llama.cpp/blob/master/examples/server/README.md) and can run locally with various models,
 though it is [optimized for models that use the llama2-style prompting](https://huggingface.co/blog/llama2#how-to-prompt-llama-2).
 Set the server URL with `-B url:...`.
 * textgen: Works with https://github.com/oobabooga/text-generation-webui and can run locally with various models.
 Needs the server URL in *$configuration_directory/textgen\_url*.
 * lorem: local non-AI lorem generator for testing

## Environment variables

The backend can be set with the `CHAP_BACKEND` environment variable.

Backend settings can be set with `CHAP_<backend_name>_<parameter_name>`, with `backend_name` and `parameter_name` all in caps.

For instance, `CHAP_LLAMA_CPP_URL=http://server.local:8080/completion` changes the default server URL for the llama-cpp back-end.
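The naming convention can be sketched as a small helper (illustrative only, not chap's actual code): hyphens in the backend name become underscores, and everything is upper-cased:

```python
def chap_env_var(backend: str, parameter: str) -> str:
    """Illustrate the CHAP_<BACKEND>_<PARAMETER> naming convention."""
    def norm(s: str) -> str:
        return s.replace("-", "_").upper()
    return f"CHAP_{norm(backend)}_{norm(parameter)}"

print(chap_env_var("llama-cpp", "url"))  # → CHAP_LLAMA_CPP_URL
```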

## Importing from ChatGPT

The userscript https://github.com/pionxzh/chatgpt-exporter can export chat logs from chat.openai.com in a JSON format.

This format is different from chap's, especially since `chap` currently only represents a single branch of conversation in one log.

You can use the `chap import` command to import all the branches of a chatgpt-style chatlog in JSON format into a series of `chap`-style chat logs.

## Plug-ins

Chap supports back-end and command plug-ins.

"Back-ends" add additional text generators.

"Commands" add new ways to interact with text generators, session data, and so forth.

Install a plugin with `pip install` or `pipx inject` (depending on how you installed chap) and then use it as normal.

[chap-backend-replay](https://pypi.org/project/chap-backend-replay/) is an example back-end plug-in. It replays answers from a previous session.

[chap-command-explain](https://pypi.org/project/chap-command-explain/) is an example command plug-in. It is similar to `chap ask`.

At this time, there is no stability guarantee for the API of commands or backends.

            
