unreasonable-llama

- **Name:** unreasonable-llama
- **Version:** 0.4.1
- **Home page:** https://github.com/SteelPh0enix/unreasonable-llama/
- **Summary:** HTTP API bindings for llama.cpp built-in example server
- **Upload time:** 2024-11-02 12:19:56
- **Author:** SteelPh0enix
- **Requires Python:** <4.0,>=3.12
- **License:** MIT
- **Keywords:** llama.cpp, llama, llm, api, bindings, stateless
# unreasonable-llama

[![Check code formatting and validity](https://github.com/SteelPh0enix/unreasonable-llama/actions/workflows/check-code.yml/badge.svg?branch=master)](https://github.com/SteelPh0enix/unreasonable-llama/actions/workflows/check-code.yml)

(Yet another) Python API for [llama.cpp server](https://github.com/ggerganov/llama.cpp/tree/master/examples/server)

For now, I'm targeting the minimal support necessary for the `/completion` and `/health` endpoints.
I may extend this library in the future.

## Requirements

`unreasonable-llama` has a single dependency, the `httpx` library; see `pyproject.toml` for details.

Requirements for the `llama.cpp` scripts can be found in the `requirements/` directory of their repository.

## Usage

See [test file](./tests/__main__.py) for example usage.

            
