|pypi| |build status| |pyver|
HCLI hai
========
HCLI hai is a Python package wrapper that contains an HCLI sample application (hai); hai is an HCLI for interacting with Anthropic's Claude models via terminal input and output streams.
----
HCLI hai wraps hai (an HCLI) and is intended to be used with an HCLI Client [1] as presented via an HCLI Connector [2].
You can find out more about HCLI at hcli.io [3].
[1] https://github.com/cometaj2/huckle
[2] https://github.com/cometaj2/hcli_core
[3] http://hcli.io
Installation
------------
HCLI hai requires a supported version of Python and pip.
You'll need an HCLI Connector to run hai. For example, you can use HCLI Core (https://github.com/cometaj2/hcli_core), a WSGI server such as Green Unicorn (https://gunicorn.org/), and an HCLI Client like Huckle (https://github.com/cometaj2/huckle).
.. code-block:: console

    pip install hcli-hai
    pip install hcli-core
    pip install huckle
    pip install gunicorn
    gunicorn --workers=1 --threads=1 -b 127.0.0.1:8000 "hcli_core:connector(\"`hcli_hai path`\")"
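In the gunicorn invocation above, the backtick-substituted hcli_hai path prints the filesystem location of the packaged hai HCLI, and that path is handed to the HCLI Connector via hcli_core:connector(...) so that hai can be served over HTTP. This reading is inferred from the command itself rather than from separate documentation.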
Usage
-----
Open a different shell window.
Set up the huckle env eval in your .bash_profile (or other bash configuration) to avoid having to execute the eval every time you want to invoke HCLIs by name (e.g. hai).
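For example, you can append the eval to your bash configuration so that new shells pick it up automatically (a minimal sketch; adjust the target file to your shell setup):

.. code-block:: console

    echo 'eval $(huckle env)' >> ~/.bash_profile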
Note that no CLI is actually installed by Huckle. Huckle reads the HCLI semantics exposed by the API via the HCLI Connector and ends up behaving *like* the CLI it targets.
.. code-block:: console

    huckle cli install http://127.0.0.1:8000
    eval $(huckle env)
    hai help
Versioning
----------
This project makes use of semantic versioning (http://semver.org) and may make use of the "devx",
"prealphax", "alphax", "betax", and "rcx" extensions where x is a number (e.g. 0.3.0-prealpha1)
on GitHub.
Supports
--------
- Chatting via input/output streams (e.g. via pipes); see the sketch after this list.
- .hai folder structure in a user's home directory to help track hai configuration and contexts.
- Creating, listing, deleting and changing conversation contexts.
- Automatic title creation based on context.
- Custom context naming to help organize contexts.
- Behavior setting to allow for persistent chatbot behavior (e.g. the Do Anything Now (DAN) prompt).
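Stream-based chatting can be illustrated with a pipe. The invocation below is an assumption for illustration only, not a documented command; run hai help to see what the HCLI actually exposes:

.. code-block:: console

    # Assumed usage: send a prompt to hai over stdin and read the reply on stdout
    echo "Explain hypermedia CLIs in one paragraph." | hai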
To Do
-----
- A memory layer for the AI HCLI (hai).

  - Automatic context switching per NLP on received input stream.
  - Context blending to marry different contexts.
  - Automatic context compression to yield a more substantial effective memory per context window.

- A shell mode for the AI HCLI (hai) to enable shell CLI execution per sought goal.
Bugs
----
N/A
.. |build status| image:: https://circleci.com/gh/cometaj2/hcli_hai.svg?style=shield
   :target: https://circleci.com/gh/cometaj2/hcli_hai
.. |pypi| image:: https://img.shields.io/pypi/v/hcli-hai?label=hcli-hai
   :target: https://pypi.org/project/hcli-hai
.. |pyver| image:: https://img.shields.io/pypi/pyversions/hcli-hai.svg
   :target: https://pypi.org/project/hcli-hai