# HandyLLM

| Field | Value |
| --- | --- |
| Name | HandyLLM |
| Version | 0.9.2 |
| Summary | A handy toolkit for using LLM. |
| Requires Python | >=3.8 |
| License | not specified |
| Keywords | LLM, Large Language Model, Prompt, OpenAI, API |
| Requirements | none recorded |
| Upload time | 2024-08-09 01:30:32 |
            <div align="middle">
  <img src="https://raw.githubusercontent.com/atomiechen/HandyLLM/main/assets/banner.svg" />
  <p>
    🌟 <b>Support us!</b> <small>If you find HandyLLM useful, please consider <a href="https://github.com/atomiechen/HandyLLM">starring it</a> to help it spread and improve</small> 🌟
  </p>
</div><br>

# HandyLLM

[![GitHub](https://img.shields.io/badge/github-HandyLLM-blue?logo=github)](https://github.com/atomiechen/HandyLLM) [![PyPI](https://img.shields.io/pypi/v/HandyLLM?logo=pypi&logoColor=white)](https://pypi.org/project/HandyLLM/) [![vsmarketplace](https://vsmarketplacebadges.dev/version-short/atomiechen.handyllm.svg)](https://marketplace.visualstudio.com/items?itemName=atomiechen.handyllm)

A handy toolkit for using LLMs, with both [***development support***](https://pypi.org/project/HandyLLM/) and [***VSCode editor support***](https://marketplace.visualstudio.com/items?itemName=atomiechen.handyllm).



## 🙌 Why HandyLLM?

📃 **Handy Prompt**: self-contained prompts in a human-friendly markup format, `.hprompt`.

- **Easy write**: markup format, placeholder variables, request arguments, output logs... and, most importantly, VSCode syntax highlighting!
- **Easy run**: both a CLI and APIs are available for parsing and running; run it with the CLI tool *WITHOUT* writing any code, or from Python as sketched below.
- **Easy chain**: chain `.hprompt` files to construct dynamic logic.

<p float="left" align="center">
  <img src="https://raw.githubusercontent.com/atomiechen/vscode-handyllm/main/demo/example.png" width="70%" />
</p>
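For a taste of the development API, here is a minimal sketch of loading, running, and chaining an `.hprompt` file from Python. The module path and method names used here (`hprompt.load_from`, `run`, and `+=` chaining) are assumptions made for illustration; please consult the [wiki](https://github.com/atomiechen/HandyLLM/wiki) for the authoritative interface.

```python
# Minimal sketch (assumed names; see the wiki for the real API).
from handyllm import hprompt

# Parse a self-contained .hprompt file: request arguments, messages,
# and placeholder variables all live in the file itself.
prompt = hprompt.load_from("./example.hprompt")

# Send the request described by the prompt and collect the result.
result = prompt.run()
print(result)

# Chain: append the result and a follow-up prompt to build dynamic logic,
# then run the combined prompt.
prompt += result
prompt += hprompt.load_from("./followup.hprompt")
result2 = prompt.run()
```

The same file can also be executed directly with the CLI tool, without writing any code.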


**Other features:**

☯️ Unified API design with both sync and async support

🍡 OpenAI and Azure APIs all in one

☕️ Easy life with API endpoint management
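As a rough illustration of the unified sync/async design, the sketch below issues a chat request both synchronously and asynchronously through a single client class. The names used here (`OpenAIClient`, `chat`, `call`, `acall`) and the configuration details (endpoint, credentials, client mode) are assumptions for the purpose of the example and may not match the actual API; see the wiki for details.

```python
# Illustrative sketch only: class and method names are assumed, not authoritative.
import asyncio
from handyllm import OpenAIClient

messages = [{"role": "user", "content": "Tell me a joke."}]

# Synchronous usage (endpoint/credentials are assumed to be configured
# on the client or via environment variables; see the wiki).
client = OpenAIClient()
response = client.chat(model="gpt-4o", messages=messages).call()

# Asynchronous usage with the same request shape.
async def main():
    return await client.chat(model="gpt-4o", messages=messages).acall()

asyncio.run(main())
```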



## Installation

```shell
pip3 install handyllm
```

Or, install from the GitHub repo to get the latest updates:

```shell
pip3 install git+https://github.com/atomiechen/handyllm.git
```

Please check out the [HandyLLM VSCode extension](https://marketplace.visualstudio.com/items?itemName=atomiechen.handyllm) for editor support.



## Documentation

Please check out our [wiki](https://github.com/atomiechen/HandyLLM/wiki) for comprehensive guides ^_^



## License

[HandyLLM](https://github.com/atomiechen/HandyLLM) © 2024 by [Atomie CHEN](https://github.com/atomiechen) is licensed under the MIT License - see the [LICENSE](https://github.com/atomiechen/HandyLLM/blob/main/LICENSE) file for details.


            
