aihandlerwindows


Name: aihandlerwindows
Version: 1.8.19
Home page: https://github.com/Capsize-Games/aihandler/tree/develop-windows
Summary: AI Handler: An engine which wraps certain huggingface models
Upload time: 2023-03-31 16:58:13
Maintainer: none listed
Docs URL: None
Author: Capsize LLC
Requires Python: >=3.10.0
License: AGPL-3.0
Keywords: ai, chatbot, chat
Requirements: No requirements were recorded.
# AI Handler
[![Upload Python Package](https://github.com/Capsize-Games/aihandler/actions/workflows/python-publish.yml/badge.svg)](https://github.com/Capsize-Games/aihandler/actions/workflows/python-publish.yml)
[![Discord](https://img.shields.io/discord/839511291466219541?color=5865F2&logo=discord&logoColor=white)](https://discord.gg/PUVDDCJ7gz)
![GitHub](https://img.shields.io/github/license/Capsize-Games/aihandler)
![GitHub last commit](https://img.shields.io/github/last-commit/Capsize-Games/aihandler)
![GitHub issues](https://img.shields.io/github/issues/Capsize-Games/aihandler)
![GitHub closed issues](https://img.shields.io/github/issues-closed/Capsize-Games/aihandler)
![GitHub pull requests](https://img.shields.io/github/issues-pr/Capsize-Games/aihandler)
![GitHub closed pull requests](https://img.shields.io/github/issues-pr-closed/Capsize-Games/aihandler)

AI Handler is a simple framework for running AI models. It wraps the Hugging Face libraries and gives you a queue, threading, a simple API, and the ability to run Stable Diffusion and LLMs seamlessly on your local hardware.

This is not intended to be used as a standalone application.

It can easily be extended to power graphical interfaces, or it can be run from the command line.
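AI Handler's own runner classes are not documented here, so the sketch below is only a minimal, standard-library illustration of the queue-plus-worker-thread pattern described above; the request dictionary and the `handle_request` function are hypothetical, not part of this package's API.

```
import queue
import threading

# Requests are queued by the UI or CLI thread and consumed by a single
# background worker, so only one model call runs at a time.
request_queue = queue.Queue()

def handle_request(request):
    # Placeholder for the real work, e.g. a Stable Diffusion pipeline
    # call or an LLM generate() call loaded elsewhere.
    print("processing:", request["action"], "-", request["prompt"])

def worker():
    while True:
        request = request_queue.get()
        if request is None:  # sentinel value shuts the worker down
            break
        handle_request(request)
        request_queue.task_done()

threading.Thread(target=worker, daemon=True).start()
request_queue.put({"action": "txt2img", "prompt": "a lighthouse at dusk"})
request_queue.join()  # block until the queued request has been handled
```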

AI Handler is a work in progress. It powers two projects at the moment, but may not be ready for general use.

## Installation

This is a work in progress.

## Pre-requisites

System requirements (see the quick check script below the list):

- Windows 10+
- Python 3.10.8
- pip 23.0.1
- CUDA Toolkit 11.7
- cuDNN 8.6.0.163
- CUDA-capable GPU
- 16 GB+ RAM
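Before installing anything, a standard-library-only check can catch an unsupported interpreter or platform early (CUDA, cuDNN, and GPU availability still have to be verified separately):

```
import platform
import sys

# aihandlerwindows targets Python 3.10 on Windows.
print("OS:", platform.system(), platform.release())
print("Python:", sys.version.split()[0])
if sys.version_info < (3, 10):
    raise SystemExit("Python 3.10+ is required")
if platform.system() != "Windows":
    print("warning: this package targets Windows")
```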

Install
```
pip install torch==1.13.1 torchvision==0.14.1 torchaudio==0.13.1 --index-url https://download.pytorch.org/whl/cu117
pip install https://github.com/w4ffl35/diffusers/archive/refs/tags/v0.14.0.ckpt_fix.tar.gz
pip install https://github.com/w4ffl35/transformers/archive/refs/tags/tensor_fix-v1.0.2.tar.gz
pip install https://github.com/acpopescu/bitsandbytes/releases/download/v0.37.2-win.0/bitsandbytes-0.37.2-py3-none-any.whl
pip install aihandlerwindows
```
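After installation, it is worth confirming that the CUDA-enabled torch build and the patched libraries import correctly. A minimal check (expected values assume the pinned versions above) might look like this:

```
import torch
import diffusers
import transformers

print("torch:", torch.__version__)               # expect 1.13.1+cu117
print("CUDA available:", torch.cuda.is_available())
print("CUDA runtime:", torch.version.cuda)       # expect 11.7
print("cuDNN:", torch.backends.cudnn.version())
print("diffusers:", diffusers.__version__)
print("transformers:", transformers.__version__)
```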

#### Optional

These are optional instructions for installing TensorRT and DeepSpeed on Windows.

##### Install TensorRT:

1. Download TensorRT-8.4.3.1.Windows10.x86_64.cuda-11.6.cudnn8.4
2. Git clone TensorRT 8.4.3.1
3. Follow their instructions to build TensorRT-8.4.3.1 python wheel
4. Install TensorRT: `pip install tensorrt-*.whl` (a quick import check follows)
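If the wheel built and installed correctly, the Python bindings should import and report the expected version:

```
import tensorrt

# A successful install of the 8.4.3.1 wheel should report that version.
print("TensorRT:", tensorrt.__version__)
```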
 
##### Install DeepSpeed:

1. Git clone DeepSpeed 0.8.1
2. Follow their instructions to build the DeepSpeed Python wheel
3. Install DeepSpeed: `pip install deepspeed-*.whl` (verify with the import check below)
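As with TensorRT, a quick import confirms the wheel installed (DeepSpeed also ships a `ds_report` command-line tool that prints a fuller environment report):

```
import deepspeed

# Should report 0.8.1 if the locally built wheel installed correctly.
print("DeepSpeed:", deepspeed.__version__)
```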

---

## Environment variables

- `AIRUNNER_ENVIRONMENT` - `dev` or `prod`. Defaults to `dev`. This controls the default `LOG_LEVEL`.
- `LOG_LEVEL` - `FATAL` for production, `DEBUG` for development. Override this to force a specific log level (see the sketch below).
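AI Handler reads these variables internally; the sketch below only illustrates the general pattern of mapping such variables onto a Python logging level, and is not the package's actual implementation:

```
import logging
import os

# dev -> DEBUG, prod -> FATAL, unless LOG_LEVEL overrides the default.
environment = os.environ.get("AIRUNNER_ENVIRONMENT", "dev")
default_level = "DEBUG" if environment == "dev" else "FATAL"
level_name = os.environ.get("LOG_LEVEL", default_level)

logging.basicConfig(level=getattr(logging, level_name))
logging.getLogger(__name__).debug("logging configured at %s", level_name)
```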

### Huggingface variables

#### Offline mode

These environment variables keep you offline until you need to download a model. This prevents unwanted online access and speeds up usage of huggingface libraries.

- `DISABLE_TELEMETRY` Keep this set to 1 at all times. Huggingface collects minimal telemetry when downloading a model from their repository, but this keeps it disabled. [See more info in this github thread](https://github.com/huggingface/diffusers/pull/1833#issuecomment-1368484414)
- `HF_HUB_OFFLINE` When loading a diffusers model, huggingface libraries will attempt to download an updated cache before running the model. This prevents that check from happening (along with a boolean passed to `load_pretrained`; see the runner.py file for examples).
- `TRANSFORMERS_OFFLINE` Similar to `HF_HUB_OFFLINE`, but for transformers models (the snippet below shows one way to set all three from Python).
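These flags are typically read when the Hugging Face libraries are imported, so they need to be set beforehand, either in the shell or at the very top of your entry point:

```
import os

# Must be set before transformers/diffusers are imported.
os.environ["DISABLE_TELEMETRY"] = "1"
os.environ["HF_HUB_OFFLINE"] = "1"
os.environ["TRANSFORMERS_OFFLINE"] = "1"

import diffusers      # noqa: E402  (imported after the environment is set)
import transformers   # noqa: E402
```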

            
