hcli-hg


Name: hcli-hg
Version: 0.1.4
Home page: https://github.com/cometaj2/hcli_hg
Summary: HCLI hg is a Python package wrapper that contains an HCLI sample application (hg); hg is an HCLI for interacting with GPT-3.5-Turbo via terminal input and output streams.
Upload time: 2024-01-16 01:41:51
Author: Jeff Michaud
License: MIT
Keywords: cli client server connector hypermedia rest generic development
Requirements: No requirements were recorded.
|pypi|_ |build status|_ |pyver|_

HCLI hg
=======

HCLI hg is a Python package wrapper that contains an HCLI sample application (hg); hg is an HCLI for interacting with GPT-3.5-Turbo via terminal input and output streams.

----

HCLI hg wraps hg (an HCLI) and is intended to be used with an HCLI Client [1] as presented via an HCLI Connector [2].

You can find out more about HCLI on hcli.io [3].

[1] https://github.com/cometaj2/huckle

[2] https://github.com/cometaj2/hcli_core

[3] http://hcli.io

Installation
------------

HCLI hg requires a supported version of Python and pip.

You'll need an HCLI Connector to run hg. For example, you can use HCLI Core (https://github.com/cometaj2/hcli_core), a WSGI server such as Green Unicorn (https://gunicorn.org/), and an HCLI Client like Huckle (https://github.com/cometaj2/huckle).


.. code-block:: console

    pip install hcli-hg
    pip install hcli-core
    pip install huckle
    pip install gunicorn
    gunicorn --workers=1 --threads=1 -b 127.0.0.1:8000 --chdir `hcli_core path` "hcli_core:connector(\"`hcli_hg path`\")"
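
Once the connector is running, you can sanity check it from another shell before pointing an HCLI client at it. This is a minimal sketch; it assumes the connector serves the HCLI root document at the base URL configured above.

.. code-block:: console

    # the connector should answer with the HCLI root resource (JSON)
    curl http://127.0.0.1:8000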

Usage
-----

Open a different shell window.

Set up the huckle env eval in your .bash_profile (or other bash configuration) to avoid having to execute eval every time you want to invoke HCLIs by name (e.g. hg).

Note that no CLI is actually installed by Huckle. Huckle reads the HCLI semantics exposed by the API via HCLI Connector and ends up behaving *like* the CLI it targets.


.. code-block:: console

    huckle cli install http://127.0.0.1:8000
    eval $(huckle env)
    hg help
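
To make the hg alias persist across shell sessions, you can add the eval line to your bash configuration, as mentioned above. This is a minimal sketch that assumes huckle is already installed and on your PATH.

.. code-block:: console

    # in ~/.bash_profile (or ~/.bashrc)
    eval $(huckle env)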

Versioning
----------
    
This project makes use of semantic versioning (http://semver.org) and may make use of the "devx",
"prealphax", "alphax", "betax", and "rcx" extensions where x is a number (e.g. 0.3.0-prealpha1)
on GitHub.

Supports
--------

- Chatting by sending command line input streams (e.g. via pipes); see the sketch after this list.
- Getting and setting a context to set up a new conversation or to save a conversation.
- Behavior setting to allow for a persistent chatbot behavior (e.g. the Do Anything Now (DAN) prompt).
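
For example, chatting over a pipe might look like the following. This is a sketch only; the chat subcommand name is an illustrative assumption, so run hg help to discover the commands the HCLI actually exposes.

.. code-block:: console

    # hypothetical invocation; the actual command tree is described by "hg help"
    echo "Summarize the HCLI pattern in one sentence." | hg chat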

To Do
-----

- A memory layer for the GPT-3.5-Turbo HCLI (hg).
    - Automatic context switching per NLP on received input stream.
    - Context blending to marry different contexts.
    - Automatic context compression to fit a more substantial memory footprint within a given context window.
- Additional commands to better save and restore conversations/contexts.
- A shell mode for the GPT-3.5-Turbo HCLI (hg) to enable shell CLI execution per sought goal.

Bugs
----

N/A

.. |build status| image:: https://circleci.com/gh/cometaj2/hcli_hg.svg?style=shield
.. _build status: https://circleci.com/gh/cometaj2/hcli_hg
.. |pypi| image:: https://img.shields.io/pypi/v/hcli-hg?label=hcli-hg
.. _pypi: https://pypi.org/project/hcli-hg
.. |pyver| image:: https://img.shields.io/pypi/pyversions/hcli-hg.svg
.. _pyver: https://pypi.org/project/hcli-hg

            
