huckle

:Name: huckle
:Version: 5.7.5
:Summary: A CLI, and Python library, that can act as an impostor for any CLI expressed through hypertext command line interface (HCLI) semantics.
:Upload time: 2025-08-21 22:44:08
:License: MIT (Copyright (c) 2017 Jeff Michaud)
:Keywords: cli, client, hypermedia, rest, generic, development
            |pypi| |build status| |pyver|

Huckle
======

Huckle is a CLI and Python library that can act as an impostor for any CLI expressed via hypertext
command line interface (HCLI) semantics.

----

Huckle provides a way for developers to interact with, or script around, any API that exposes HCLI
semantics, while providing dynamic, up-to-date, in-band access to all of the API/CLI documentation,
man page style, showcasing the commands, options, and parameters available for execution.

Most, if not all, programming languages have a way to issue shell commands. With the help
of a generic HCLI client such as Huckle, APIs that make use of HCLI semantics are readily consumable
anywhere via the familiar CLI mode of operation, without the need to write
a custom, dedicated CLI to interact with each specific API.

Huckle can also be used as a Python library to interact with HCLI APIs from Python code in much the
same way as from a bash terminal.

You can access a simple example HCLI service to play with huckle at http://hcli.io [1].

The HCLI Internet-Draft [2] is a work in progress by the author, and
the current implementation leverages hal+json alongside a static form of ALPS
semantic profile [3] to help enable widespread cross-media-type support.

Help shape huckle and HCLI on Discord [4] or by raising issues on GitHub!

[1] http://hcli.io

[2] https://github.com/cometaj2/I-D/tree/master/hcli

[3] http://alps.io

[4] https://discord.gg/H2VeFSgXv2

Install Python, pip and huckle
------------------------------

Huckle requires bash with access to man pages, Python and pip. Install a supported version of Python for your system.

Install huckle via Python's pip:

.. code-block:: console

    pip install huckle

Basic usage
-----------

huckle env

    This provides a sample environment configuration for your PATH environment variable. It can be configured permanently
    by adding 'eval $(huckle env)' to your shell startup configuration
    (e.g. .bashrc, .bash_profile, .profile).

huckle cli install \<url>

    This attempts to auto-create and configure a CLI by name when provided with the root URL of an HCLI API.
    If successful, the CLI can be invoked by name after updating the PATH (see 'huckle env'). You can permanently enable
    HCLI entrypoint scripts by adding 'eval $(huckle env)' to your ~/.bashrc, ~/.bash_profile, or ~/.profile.

    Note that an existing configuration file is left alone if the command is run multiple times 
    for the same CLI.

    An example HCLI that can be used with Huckle is available on hcli.io:
        - `<http://hcli.io/hcli/cli/jsonf?command=jsonf>`_ (HCLI root)  
        - `<http://hcli.io/hal/#/hcli/cli/jsonf?command=jsonf>`_ (HAL Browser navigation)

    Alternatively, a WSGI application can be stood up very quickly using sample HCLIs available via hcli_core `<https://pypi.org/project/hcli-core/>`_

huckle cli run \<cliname>

    This invokes the named CLI to issue HCLI API calls, the details of which are left to API implementers.

    Commands, options and parameters are presented gradually, to provide users with a way to
    incrementally discover and learn how the CLI is used.

\<cliname> ...

    For brevity, the CLI name can and should be invoked directly rather than through "huckle cli run \<cliname>".

\<cliname> ... help

    The reserved "help" command can be used anywhere in a command line sequence to have huckle generate
    man-page-like documentation from the last successfully received HCLI Document. This helps with CLI exploration.

huckle help

    This opens up a man page that describes how to use huckle.
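
Putting the commands above together, a first session might look like this (a sketch using the example jsonf HCLI from hcli.io noted earlier; output elided):

.. code-block:: console

    eval $(huckle env)
    huckle cli install "http://hcli.io/hcli/cli/jsonf?command=jsonf"
    jsonf help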

Python Library - Basic Usage
----------------------------

Here's a bit of Flask web frontend logic from haillo (`<https://github.com/cometaj2/haillo/haillo.py>`_) that
uses huckle as a Python library to get data from an HCLI AI chat application called
'hai' (`<https://github.com/cometaj2/hcli_hai>`_).
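
The core library pattern, used throughout the example below, is that cli() takes a command line string and returns an iterable of (dest, chunk) tuples, where dest is 'stdout' or 'stderr' and chunk is bytes (stdin() is a context manager for feeding a stream to the command). A minimal sketch of the aggregation step; the chunks are simulated here so the sketch stands alone, whereas in real use they would come from huckle's cli():

```python
import json

def collect(chunks):
    # Aggregate (dest, chunk) tuples into decoded stdout/stderr strings.
    out = {"stdout": "", "stderr": ""}
    for dest, chunk in chunks:
        out[dest] += chunk.decode()
    return out

# Simulated chunks; in real use: from huckle import cli; chunks = cli("hai ls --json")
chunks = [("stdout", b'[{"update_time": 2},'), ("stdout", b' {"update_time": 5}]')]
result = collect(chunks)

# Same sort as get_chat_list() below: most recently updated chats first.
chats = sorted(json.loads(result["stdout"]), key=lambda x: x["update_time"], reverse=True)
```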

.. code-block:: python

    import flask
    import config
    import json
    import logger
    import sys
    import io
    import traceback
    import contextlib
    import urllib3
    import time
    from huckle import cli, stdin
    from flask import Flask, render_template, send_file, jsonify, Response, redirect, url_for, request, render_template_string
    import subprocess
    import ast

    logging = logger.Logger()
    logging.setLevel(logger.INFO)

    app = None


    def get_chat_list():
        try:
            chunks = cli("hai ls --json")
            json_string = ""
            for dest, chunk in chunks:
                if dest == 'stdout':
                    json_string += chunk.decode()
            chats = json.loads(json_string)
            # Sort the list with most recent dates first
            sorted_chats = sorted(chats, key=lambda x: x['update_time'], reverse=True)
            return sorted_chats
        except Exception as error:
            logging.error(f"Error getting chat list: {error}")
            return []

    def parse_context(context_str):
        try:
            context_data = json.loads(context_str)
            return {
                'messages': context_data.get('messages', []),
                'name': context_data.get('name', ''),
                'title': context_data.get('title', '')
            }
        except json.JSONDecodeError as e:
            logging.error(f"Error parsing context JSON: {e}")
            return {'messages': [], 'name': '', 'title': ''}

    def webapp():
        app = Flask(__name__)

        @app.route('/')
        def index():
            try:
                # Get the current context
                chunks = cli("hai context --json")
                context_str = ""
                for dest, chunk in chunks:
                    if dest == 'stdout':
                        context_str += chunk.decode()

                # Get chat list for sidebar
                chats = get_chat_list()

                # Get model in use
                chunks = cli(f"hai model --json")
                model = ""
                for dest, chunk in chunks:  # Now unpacking tuple of (dest, chunk)
                    if dest == 'stdout':
                        model += chunk.decode()

                # Convert to a python list
                model = ast.literal_eval(model)[0]

                # Get models list
                chunks = cli(f"hai model ls --json")
                models = ""
                for dest, chunk in chunks:  # Now unpacking tuple of (dest, chunk)
                    if dest == 'stdout':
                        models += chunk.decode()

                # Convert to a python list
                models = ast.literal_eval(models)

                # Parse the context into structured data
                context_data = parse_context(context_str)

                popup_message = request.args.get('popup')  # Get from query param
                return render_template('index.html',
                                        messages=context_data['messages'],
                                        name=context_data['name'],
                                        title=context_data['title'],
                                        chats=chats,
                                        model=model,
                                        models=models,
                                        popup_message=popup_message)
            except Exception as error:
                logging.error(traceback.format_exc())
                return render_template('index.html',
                                        messages=[],
                                        name='',
                                        title='',
                                        chats=[],
                                        model=None,
                                        models=[],
                                        popup_message=None)

        @app.route('/chat_history')
        def chat_history():
            try:
                # Get chat list for sidebar
                chats = get_chat_list()

                # Get model in use
                chunks = cli(f"hai model --json")
                model = ""
                for dest, chunk in chunks:  # Now unpacking tuple of (dest, chunk)
                    if dest == 'stdout':
                        model += chunk.decode()

                # Convert to a python list
                model = ast.literal_eval(model)[0]

                # Get models list
                chunks = cli(f"hai model ls --json")
                models = ""
                for dest, chunk in chunks:  # Now unpacking tuple of (dest, chunk)
                    if dest == 'stdout':
                        models += chunk.decode()

                # Convert to a python list
                models = ast.literal_eval(models)

                popup_message = request.args.get('popup')  # Get from query param
                return render_template('chat_history.html', chats=chats,
                                        model=model,
                                        models=models,
                                        popup_message=popup_message)
            except Exception as error:
                logging.error(traceback.format_exc())

            return render_template('chat_history.html', chats=[],
                                    model=None,
                                    models=[],
                                    popup_message=None)

        # We select and set a chat context
        @app.route('/context/<context_id>')
        def navigate_context(context_id):
            try:
                logging.info("Switching to context_id " + context_id)

                chunks = cli(f"hai set {context_id}")
                stderr_output = ""
                stdout_output = ""
                for dest, chunk in chunks:
                    if dest == 'stderr':
                        stderr_output += chunk.decode()
                    elif dest == 'stdout':
                        stdout_output += chunk.decode()

                if stderr_output:
                    logging.error(f"{stderr_output}")
                    return redirect(url_for('index', popup=f"{stderr_output}"))

                return redirect(url_for('index'))
            except Exception as error:
                logging.error(f"Context switch failed: {error}")
                logging.error(traceback.format_exc())
                return redirect(url_for('index'))

        # We delete a chat context with hai rm
        @app.route('/context/<context_id>', methods=['POST'])
        def delete_context(context_id):
            try:
                logging.info("Removing context_id " + context_id)

                chunks = cli(f"hai rm {context_id}")
                for dest, chunk in chunks:
                    pass

                return redirect(url_for('index'))
            except Exception as error:
                logging.error(f"Context deletion failed: {error}")
                logging.error(traceback.format_exc())
                return redirect(url_for('index'))

        # We stream chat data to hai
        @app.route('/chat', methods=['POST'])
        def chat():
            try:
                message = request.form.get('message')
                logging.info("Sending message to selected model...")
                stream = io.BytesIO(message.encode('utf-8'))

                with stdin(stream):
                    chunks = cli(f"hai")
                    stderr_output = ""
                    stdout_output = ""
                    for dest, chunk in chunks:
                        if dest == 'stderr':
                            stderr_output += chunk.decode()
                        elif dest == 'stdout':
                            stdout_output += chunk.decode()

                    if stderr_output:
                        logging.error(f"{stderr_output}")
                        return redirect(url_for('index', popup=f"{stderr_output}"))

                return redirect(url_for('index'))
            except Exception as error:
                logging.error(f"Context switch failed: {error}")
                logging.error(traceback.format_exc())
                return redirect(url_for('index'))

        # We set the model with hai model set
        @app.route('/set_model', methods=['POST'])
        def set_model():
            try:
                model = request.form.get('model')
                logging.info(f"Setting model to {model}")

                chunks = cli(f"hai model set {model}")
                for dest, chunk in chunks:
                    pass

                return redirect(url_for('index'))
            except Exception as error:
                logging.error(f"Model switch failed: {error}")
                logging.error(traceback.format_exc())
                return redirect(url_for('index'))

        # We create a new hai chat context with hai new
        @app.route('/new_chat', methods=['POST'])
        def new_chat():
            try:
                chunks = cli("hai new")
                stderr_output = ""
                stdout_output = ""
                for dest, chunk in chunks:
                    if dest == 'stderr':
                        stderr_output += chunk.decode()
                    elif dest == 'stdout':
                        stdout_output += chunk.decode()

                if stderr_output:
                    logging.error(f"{stderr_output}")
                    return redirect(url_for('index', popup=f"{stderr_output}"))

                return redirect(url_for('index'))
            except Exception as error:
                logging.error(f"New chat creation failed: {error}")
                logging.error(traceback.format_exc())
                return redirect(url_for('index'))

        # We start vibing
        @app.route('/vibestart', methods=['POST'])
        def vibe_start():
            try:
                chunks = cli("hai vibe start")
                for dest, chunk in chunks:
                    pass

                return redirect(url_for('index'))
            except Exception as error:
                logging.error(f"Vibing failed: {error}")
                logging.error(traceback.format_exc())
                return redirect(url_for('index'))

        # We stop vibing
        @app.route('/vibestop', methods=['POST'])
        def vibe_stop():
            try:
                chunks = cli("hai vibe stop")
                for dest, chunk in chunks:
                    pass

                return redirect(url_for('index'))
            except Exception as error:
                logging.error(f"Vibing failed: {error}")
                logging.error(traceback.format_exc())
                return redirect(url_for('index'))

        @app.route('/vibe_status', methods=['GET'])
        def vibe_status():
            chunks = cli("hai vibe status")
            status_str = ""
            for dest, chunk in chunks:
                if dest == 'stdout':
                    status_str += chunk.decode()
            return status_str

        @app.route('/manifest.json')
        def serve_manifest():
            return app.send_static_file('manifest.json')

        return app

    app = webapp()

Configuration
-------------

Huckle uses small scripts under ~/.huckle/bin to enable CLIs to be invoked by name.

Huckle also uses CLI configuration files (e.g. ~/.huckle/etc/\<cliname>/config) to associate a specific
CLI to an HCLI API root URL and other CLI specific configuration.
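
Concretely, for a hypothetical CLI named jsonf, the layout described above resolves to paths like these (a sketch; jsonf is just the example name):

```python
from pathlib import Path

cliname = "jsonf"  # hypothetical example CLI name

bin_dir = Path.home() / ".huckle" / "bin"                           # entrypoint scripts, one per CLI
config_file = Path.home() / ".huckle" / "etc" / cliname / "config"  # per-CLI configuration
```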

Versioning
----------

This project makes use of semantic versioning (http://semver.org) and may make use of the "devx",
"prealphax", "alphax", "betax", and "rcx" extensions, where x is a number (e.g. 0.3.0-prealpha1),
on GitHub. Only full major.minor.patch releases will be pushed to pip from now on.
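
The tag scheme above can be expressed as a pattern; a sketch (the regex is one reading of the scheme, not something huckle ships):

```python
import re

# major.minor.patch with an optional devN/prealphaN/alphaN/betaN/rcN suffix.
VERSION_RE = re.compile(
    r"^(?P<major>\d+)\.(?P<minor>\d+)\.(?P<patch>\d+)"
    r"(?:-(?P<pre>(?:dev|prealpha|alpha|beta|rc)\d+))?$"
)

release = VERSION_RE.match("5.7.5")               # full release: pushed to pip
prerelease = VERSION_RE.match("0.3.0-prealpha1")  # pre-release: GitHub only
```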

Supports
--------

- HTTP/HTTPS

- Various authentication and/or passthrough modes per CLI configuration

    - HTTP Basic Auth
    - HCLI Core API Key Authentication (HCOAK)

- HCLI version 1.0 semantics for:

    - hal+json

- Automatic man-page-like documentation generation with the "help" command, anywhere in a CLI.

- Command line execution responses for:

    - All media types

- Streaming:

    - Handles very large stdin/stdout streams (fixed chunk size of 16834)

- SOCKS tunneling through environment variables (ALL_PROXY)

- Auto configuration of a CLI when provided with an HCLI API root URL (e.g. huckle cli install `<http://hcli.io/hcli/cli/jsonf?command=jsonf>`_)

- Listing of installed CLIs

- Listing of the configuration of a CLI

- Auto discovery of cli link relations when attempting to install from a root resource that isn't an hcli-document.

- URL pinning/caching, and cache flushing, of successfully traversed final execution URLs, to speed up execution of already executed command sequences.

- Use as a python library along with simple stdin-and-stdout-like data streaming.

- RFC 9457, per HCLI specification, to help yield consistent stderr output.

- Customizable logging and log level configuration for debugging and for stderr messages.

- Huckle HCLI configuration and credentials management via huckle commands

- Keyring as credential helper (see `<https://github.com/jaraco/keyring>`_)

- Configurable text only or man page help output to help support python library use (no man pages)

To Do
-----

- Fork restnavigator repo or otherwise adjust to use restnavigator with requests (single http client instead of two)

- Support HCLI version 1.0 semantics for: 

    - Collection+JSON
    - hal+xml
    - Uber
    - HTML
    - Siren
    - JSON-LD
    - JSON API
    - Mason

- Support stream configuration

    - sending and receiving streams (configurable via CLI config)
    - sending and receiving non-streams (configuration via CLI config)
    - chunk size for streams send/receive (configurable via CLI config)

- Support non-stream send/receive (via CLI configuration)

- Support various authentication and/or passthrough per CLI configuration

    - HTTP Digest
    - Oauth2
    - X509 (HTTPS mutual authentication)
    - AWS
    - SAML

- Better implementation for huckle params/options handling

- Support for viewing information about an HCLI root (e.g. huckle view `<http://hcli.io/hcli/cli/jsonf?command=jsonf>`_)

- Support forward proxy configuration through proxy environment variables (HTTP_PROXY, HTTPS_PROXY)

- Support HCLI name conflict resolution (use namespaces?)

    - View currently selected namespace (e.g. huckle ns)
    - Viewing namespace list (e.g. huckle ns list)
    - Selecting a namespace (e.g. huckle ns use abc)
    - Remove an entire namespace and all associated CLIs (e.g. huckle ns rm abc)
    - Support adding and removing CLIs to namespaces

- Support multipart/form-data for very large uploads (see requests-toolbelt)

- Support HCLI nativization as outlined in the HCLI specification

- Support better help output for python library use

- Support better Huckle configuration and HCLI customization for python library use

- Support full in memory configuration use to avoid filesystem files in a python library use context

- Add circleci tests for python library use (input and output streaming)

- Add configuration to support auth throughout (HCLI + API) or only against the final API calls

Bugs
----

- An old cache (pinned urls) can sometimes yield unexpected failures. This has been observed with hcli_hc.

Known Issues
------------

- If developing using pip editable installs of huckle and/or of code referencing huckle as a library, an accidental working-directory subfolder named huckle, wherever you happen to be when executing related code, may prevent huckle from being imported properly as a library. This is a known side effect of PEP 660 and import shadowing with pip editable installs. It can be mitigated by ensuring no such subfolder exists where you are working, and possibly by switching editable installs to compatibility mode (e.g. pip install -e . --config-settings editable_mode=compat). See https://github.com/pypa/setuptools/issues/3548.

.. |build status| image:: https://circleci.com/gh/cometaj2/huckle.svg?style=shield
   :target: https://circleci.com/gh/cometaj2/huckle
.. |pypi| image:: https://img.shields.io/pypi/v/huckle?label=huckle
   :target: https://pypi.org/project/huckle
.. |pyver| image:: https://img.shields.io/pypi/pyversions/huckle.svg
   :target: https://pypi.org/project/huckle

            

        try:\n                message = request.form.get('message')\n                logging.info(\"Sending message to selected model...\")\n                stream = io.BytesIO(message.encode('utf-8'))\n\n                with stdin(stream):\n                    chunks = cli(f\"hai\")\n                    stderr_output = \"\"\n                    stdout_output = \"\"\n                    for dest, chunk in chunks:\n                        if dest == 'stderr':\n                            stderr_output += chunk.decode()\n                        elif dest == 'stdout':\n                            stdout_output += chunk.decode()\n\n                    if stderr_output:\n                        logging.error(f\"{stderr_output}\")\n                        return redirect(url_for('index', popup=f\"{stderr_output}\"))\n\n                return redirect(url_for('index'))\n            except Exception as error:\n                logging.error(f\"Context switch failed: {error}\")\n                logging.error(traceback.format_exc())\n                return redirect(url_for('index'))\n\n        # We set the model with hai model set\n        @app.route('/set_model', methods=['POST'])\n        def set_model():\n            try:\n                model = request.form.get('model')\n                logging.info(f\"Setting model to {model}\")\n\n                chunks = cli(f\"hai model set {model}\")\n                for dest, chunk in chunks:\n                    pass\n\n                return redirect(url_for('index'))\n            except Exception as error:\n                logging.error(f\"Model switch failed: {error}\")\n                logging.error(traceback.format_exc())\n                return redirect(url_for('index'))\n\n        # We create a new hai chat context with hai new\n        @app.route('/new_chat', methods=['POST'])\n        def new_chat():\n            try:\n                chunks = cli(\"hai new\")\n                stderr_output = \"\"\n                
stdout_output = \"\"\n                for dest, chunk in chunks:\n                    if dest == 'stderr':\n                        stderr_output += chunk.decode()\n                    elif dest == 'stdout':\n                        stdout_output += chunk.decode()\n\n                if stderr_output:\n                    logging.error(f\"{stderr_output}\")\n                    return redirect(url_for('index', popup=f\"{stderr_output}\"))\n\n                return redirect(url_for('index'))\n            except Exception as error:\n                logging.error(f\"New chat creation failed: {error}\")\n                logging.error(traceback.format_exc())\n                return redirect(url_for('index'))\n\n        # We start vibing\n        @app.route('/vibestart', methods=['POST'])\n        def vibe_start():\n            try:\n                chunks = cli(\"hai vibe start\")\n                for dest, chunk in chunks:\n                    pass\n\n                return redirect(url_for('index'))\n            except Exception as error:\n                logging.error(f\"Vibing failed: {error}\")\n                logging.error(traceback.format_exc())\n                return redirect(url_for('index'))\n\n        # We stop vibing\n        @app.route('/vibestop', methods=['POST'])\n        def vibe_stop():\n            try:\n                chunks = cli(\"hai vibe stop\")\n                for dest, chunk in chunks:\n                    pass\n\n                return redirect(url_for('index'))\n            except Exception as error:\n                logging.error(f\"Vibing failed: {error}\")\n                logging.error(traceback.format_exc())\n                return redirect(url_for('index'))\n\n        @app.route('/vibe_status', methods=['GET'])\n        def vibe_status():\n            chunks = cli(\"hai vibe status\")\n            status_str = \"\"\n            for dest, chunk in chunks:\n                if dest == 'stdout':\n                    status_str += 
chunk.decode()\n            return status_str\n\n        @app.route('/manifest.json')\n        def serve_manifest():\n            return app.send_static_file('manifest.json')\n\n        return app\n\n    app = webapp()\n\nConfiguration\n-------------\n\nHuckle uses small scripts under ~/.huckle/bin to enable CLIs to be invoked by name.\n\nHuckle also uses CLI configuration files (e.g. ~/.huckle/etc/\\<cliname>/config) to associate a specific\nCLI to an HCLI API root URL and other CLI specific configuration.\n\nVersioning\n----------\n\nThis project makes use of semantic versioning (http://semver.org) and may make use of the \"devx\",\n\"prealphax\", \"alphax\" \"betax\", and \"rcx\" extensions where x is a number (e.g. 0.3.0-prealpha1)\non github. Only full major.minor.patch releases will be pushed to pip from now on.\n\nSupports\n--------\n\n- HTTP/HTTPS\n\n- Support various authentication and/or passthrough per CLI configuration\n\n    - HTTP Basic Auth\n    - HCLI Core API Key Authentication (HCOAK)\n\n- HCLI version 1.0 semantics for:\n\n    - hal+json\n\n- Automatic man page like documentation generation with the \"help\" command, anywhere in a CLI.\n\n- Command line execution responses for\n\n    - All media types\n\n- Streaming:\n\n    - Handles very large stdin/stdout streams (fixed chunk size of 16834)\n\n- SOCKS tunneling through environment variables (ALL_PROXY)\n\n- Auto configuration of a CLI when provided with an HCLI API root URL (e.g. 
huckle cli install `<http://hcli.io/hcli/cli/jsonf?command=jsonf>`_)\n\n- Listing of installed CLIs\n\n- Listing of the configuration of a CLI\n\n- Auto discovery of cli link relations when attempting to install from a root resource that isn't an hcli-document.\n\n- URL pinning/caching, and cache flushing, of successfully traversed final execution URLs, to speed up execution of already executed command sequences.\n\n- Use as a python library along with simple stdin-and-stdout-like data streaming.\n\n- RFC 9457, per HCLI specification, to help yield consistent stderr output.\n\n- Customizable logging and log level configuration for debugging and for stderr messages.\n\n- Huckle HCLI configuration and credentials management via huckle commands\n\n- Keyring as credential helper (see `<https://github.com/jaraco/keyring>`_)\n\n- Configurable text only or man page help output to help support python library use (no man pages)\n\nTo Do\n-----\n\n- Fork restnavigator repo or otherwise adjust to use restnavigator with requests (single http client instead of two)\n\n- Support HCLI version 1.0 semantics for: \n\n    - Collection+JSON\n    - hal+xml\n    - Uber\n    - HTML\n    - Siren\n    - JSON-LD\n    - JSON API\n    - Mason\n\n- Support stream configuration\n\n    - sending and receiving streams (configurable via CLI config)\n    - sending and receiving non-streams (configuration via CLI config)\n    - chunk size for streams send/receive (configurable via CLI config)\n\n- Support non-stream send/receive (via CLI configuration)\n\n- Support various authentication and/or passthrough per CLI configuration\n\n    - HTTP Digest\n    - Oauth2\n    - X509 (HTTPS mutual authentication)\n    - AWS\n    - SAML\n\n- Better implementation for huckle params/options handling\n\n- Support for viewing information about an HCLI root (e.g. 
huckle view `<http://hcli.io/hcli/cli/jsonf?command=jsonf>`_)\n\n- Support forward proxy configuration through proxy environment variables (HTTP_PROXY, HTTPS_PROXY)\n\n- Support hcli name conflic resolution (use namespaces?)\n\n    - View currently selected namespace (e.g. huckle ns)\n    - Viewing namespace list (e.g. huckle ns list)\n    - Selecting a namespace (e.g. huckle ns use abc)\n    - Remove an entire namespace and all associated CLIs (e.g. huckle ns rm abc)\n    - Support adding and removing CLIs to namespaces\n\n- Support multipart/form-data for very large uploads (see requests-toolbelt)\n\n- Support HCLI nativization as outlined in the HCLI specification\n\n- Support better help output for python library use\n\n- Support better Huckle configuration and HCLI customization for python library use\n\n- Support full in memory configuration use to avoid filesystem files in a python library use context\n\n- Add circleci tests for python library use (input and output streaming)\n\n- Add configuration to support auth throughout (HCLI + API) or only against the final API calls\n\nBugs\n----\n\n- An old cache (pinned urls) can sometimes yield unexpected failures. This has been observed with hcli_hc.\n\nKnown Issues\n------------\n\n- If developing using pip editable installs on huckle and/or code referencing huckle as a library, an accidental working directory subfolder named huckle, anywhere you happen to be when executing related code, may prevent huckle from being imported properly as a library. This is a known side effect of PEP660 and import overshadowing on pip editable installs. This can be mitigated by ensuring no such subfolder exists where you are working, and may be by changing editable installs to compability mode (e.g. pip install -e . --config-settings editable_mode=compat). See https://github.com/pypa/setuptools/issues/3548.\n\n.. 
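For reference, the (dest, chunk) streaming pattern repeated throughout the webapp example above can be factored into a single helper. This is an illustrative sketch, not part of huckle's API: the ``collect`` name and the simulated stream are assumptions, and the only behavior relied on is that the stream yields (dest, bytes) tuples, as the example shows.

```python
def collect(chunks):
    """Accumulate a huckle-style stream of (dest, chunk) tuples into
    decoded stdout and stderr strings, mirroring the loops in the
    webapp example above."""
    stdout_output, stderr_output = "", ""
    for dest, chunk in chunks:
        if dest == 'stdout':
            stdout_output += chunk.decode()
        elif dest == 'stderr':
            stderr_output += chunk.decode()
    return stdout_output, stderr_output

# Simulated stream standing in for something like cli("hai context --json")
simulated = [('stdout', b'{"name": '), ('stdout', b'"demo"}'), ('stderr', b'')]
out, err = collect(simulated)
print(out)  # {"name": "demo"}
```

With such a helper, each route body in the webapp example reduces to one call plus a check on the collected stderr string.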
.. |build status| image:: https://circleci.com/gh/cometaj2/huckle.svg?style=shield
   :target: https://circleci.com/gh/cometaj2/huckle
.. |pypi| image:: https://img.shields.io/pypi/v/huckle?label=huckle
   :target: https://pypi.org/project/huckle
.. |pyver| image:: https://img.shields.io/pypi/pyversions/huckle.svg
   :target: https://pypi.org/project/huckle
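As a reference for the environment-driven behavior described in the Configuration and Supports sections above, a shell setup might look like the following. This is a hedged sketch: the proxy address is an arbitrary example, and prepending ~/.huckle/bin to PATH is an assumption about how the generated launcher scripts are meant to be found by name.

```shell
# SOCKS tunneling is read from the environment (see Supports above);
# the proxy address here is an example, not a huckle default.
export ALL_PROXY=socks5://localhost:1080

# Per the Configuration section, generated launcher scripts live under
# ~/.huckle/bin and per-CLI configuration under ~/.huckle/etc/<cliname>/config;
# putting the bin directory on PATH lets installed CLIs be invoked by name.
export PATH="$HOME/.huckle/bin:$PATH"
```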
    "bugtrack_url": null,
    "summary": "A CLI, and python library, that can act as an impostor for any CLI expressed through hypertext command line interface (HCLI) semantics.",
    "version": "5.7.5",
    "project_urls": {
        "Homepage": "https://github.com/cometaj2/huckle"
    },
    "split_keywords": [
        "cli",
        " client",
        " hypermedia",
        " rest",
        " generic",
        " development"
    ],
    "urls": [
        {
            "comment_text": null,
            "digests": {
                "blake2b_256": "54c1637f6466c67b12aef7691a72fe80c070aa6bea42b8fef96228cd4c391336",
                "md5": "9e700f0ae886e5578bb155832490c9e9",
                "sha256": "abe33c37a988c82cd0134b003811cb06a8c5f27c2016743007c2e65ca80b0247"
            },
            "downloads": -1,
            "filename": "huckle-5.7.5-py3-none-any.whl",
            "has_sig": false,
            "md5_digest": "9e700f0ae886e5578bb155832490c9e9",
            "packagetype": "bdist_wheel",
            "python_version": "py3",
            "requires_python": null,
            "size": 29778,
            "upload_time": "2025-08-21T22:44:07",
            "upload_time_iso_8601": "2025-08-21T22:44:07.087586Z",
            "url": "https://files.pythonhosted.org/packages/54/c1/637f6466c67b12aef7691a72fe80c070aa6bea42b8fef96228cd4c391336/huckle-5.7.5-py3-none-any.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": null,
            "digests": {
                "blake2b_256": "dbbf7b53fa9f8f5791b4d3776ca23be1400b550293155ad5b65641af8480284c",
                "md5": "8451e47dee5177fc37abb6f0968d53ea",
                "sha256": "7d53b4d096644b794e0eae0dc317f114b37212f95fb9fcd38d843f472fac2c46"
            },
            "downloads": -1,
            "filename": "huckle-5.7.5.tar.gz",
            "has_sig": false,
            "md5_digest": "8451e47dee5177fc37abb6f0968d53ea",
            "packagetype": "sdist",
            "python_version": "source",
            "requires_python": null,
            "size": 33778,
            "upload_time": "2025-08-21T22:44:08",
            "upload_time_iso_8601": "2025-08-21T22:44:08.402258Z",
            "url": "https://files.pythonhosted.org/packages/db/bf/7b53fa9f8f5791b4d3776ca23be1400b550293155ad5b65641af8480284c/huckle-5.7.5.tar.gz",
            "yanked": false,
            "yanked_reason": null
        }
    ],
    "upload_time": "2025-08-21 22:44:08",
    "github": true,
    "gitlab": false,
    "bitbucket": false,
    "codeberg": false,
    "github_user": "cometaj2",
    "github_project": "huckle",
    "travis_ci": false,
    "coveralls": false,
    "github_actions": false,
    "circle": true,
    "lcname": "huckle"
}
        