HandyLLM

Name: HandyLLM
Version: 0.7.1
Summary: A handy toolkit for using LLM.
Upload time: 2024-05-07 12:11:53
Requires Python: >=3.7
Keywords: LLM, Large Language Model, Prompt, OpenAI, API
            <div align="middle">
  <img src="https://raw.githubusercontent.com/atomiechen/HandyLLM/main/assets/banner.svg" />
</div><br>

# HandyLLM

[![GitHub](https://img.shields.io/badge/github-HandyLLM-blue?logo=github)](https://github.com/atomiechen/HandyLLM) [![PyPI](https://img.shields.io/pypi/v/HandyLLM?logo=pypi&logoColor=white)](https://pypi.org/project/HandyLLM/) [![vsmarketplace](https://vsmarketplacebadges.dev/version-short/atomiechen.handyllm.svg)](https://marketplace.visualstudio.com/items?itemName=atomiechen.handyllm)

A handy toolkit for using LLMs, with both [***development support***](https://pypi.org/project/HandyLLM/) and [***VSCode editor support***](https://marketplace.visualstudio.com/items?itemName=atomiechen.handyllm).



## 🌟 Why HandyLLM?

📃 **Handy Prompt**: a self-contained prompt in a human-friendly markup format, `.hprompt`.

- **Easy to write**: markup format, placeholder variables, request arguments, output logs... and, most importantly, VSCode syntax highlighting!
- **Easy to run**: both a CLI and APIs are available for parsing and running; run it with the CLI tool *without* writing any code!
- **Easy to chain**: chain `.hprompt` files to construct dynamic logic.

<p float="left" align="center">
  <img src="https://raw.githubusercontent.com/atomiechen/vscode-handyllm/main/demo/example.jpg" width="60%" />
</p>
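As a rough sketch of what a self-contained `.hprompt` file can look like (the frontmatter keys, role markers, and `%variable%` placeholder syntax below are assumptions for illustration; see the project wiki for the authoritative format):

```
---
model: gpt-4
temperature: 0.7
---

$system$
You are a helpful assistant.

$user$
Tell me a joke about %topic%.
```

The idea is that the request arguments live in the frontmatter alongside the prompt body, so a single file carries everything needed to parse and run it.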

**Other features:**

☯️ Unified API design with both sync and async support

🍡 OpenAI and Azure APIs all in one

☕️ Easy life with API endpoint management
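A minimal Python sketch of the workflow described above. The module, function, and attribute names here are based on the project wiki but should be treated as assumptions, not a verbatim API reference; running it also requires a configured API credential and the hypothetical `.hprompt` files on disk:

```python
from handyllm import hprompt

# Load a self-contained .hprompt file (hypothetical path)
prompt = hprompt.load_from("./magic.hprompt")

# Send the request described by the file's frontmatter and body
result = prompt.run()
print(result.result_str)

# Chain the result into a follow-up prompt to build dynamic logic
prompt += result
prompt += hprompt.load_from("./followup.hprompt")
final = prompt.run()
```

Consult the [wiki](https://github.com/atomiechen/HandyLLM/wiki) for the exact loading, running, and chaining calls.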



## Installation

```shell
pip3 install handyllm
```

or install from the GitHub repo to get the latest updates:

```shell
pip3 install git+https://github.com/atomiechen/handyllm.git
```

Check out the [HandyLLM VSCode extension](https://marketplace.visualstudio.com/items?itemName=atomiechen.handyllm) for editor support.



## Documentation

Please check out our [wiki](https://github.com/atomiechen/HandyLLM/wiki) for comprehensive guides ^_^


            
