# Py-TGI (Py-TXI at this point xD)
[![PyPI version](https://badge.fury.io/py/py-tgi.svg)](https://badge.fury.io/py/py-tgi)
[![PyPI - Python Version](https://img.shields.io/pypi/pyversions/py-tgi)](https://pypi.org/project/py-tgi/)
[![PyPI - Format](https://img.shields.io/pypi/format/py-tgi)](https://pypi.org/project/py-tgi/)
[![Downloads](https://pepy.tech/badge/py-tgi)](https://pepy.tech/project/py-tgi)
[![PyPI - License](https://img.shields.io/pypi/l/py-tgi)](https://pypi.org/project/py-tgi/)
[![Tests](https://github.com/IlyasMoutawwakil/py-tgi/actions/workflows/tests.yaml/badge.svg)](https://github.com/IlyasMoutawwakil/py-tgi/actions/workflows/tests.yaml)
Py-TGI is a Python wrapper around [Text-Generation-Inference](https://github.com/huggingface/text-generation-inference) and [Text-Embedding-Inference](https://github.com/huggingface/text-embeddings-inference) that lets you create and run TGI/TEI instances through the awesome `docker-py`, in a style similar to the Transformers API.
## Installation
```bash
pip install py-tgi
```
Py-TGI is designed to be used like the Transformers API. It relies on `docker-py` (rather than a dirty `subprocess` solution) so that the containers you run are tied to the main process and are stopped automatically when your code finishes or fails.
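Under the hood, the idea is roughly the following (a simplified `docker-py` sketch, not the library's actual code; the image tag is an assumption):

```python
import docker

# Because the container handle lives in the Python process, it can be stopped
# in a `finally` block (or at exit) instead of lingering after a crashed script.
client = docker.from_env()
container = client.containers.run(
    "ghcr.io/huggingface/text-generation-inference:latest",  # assumed image tag
    detach=True,
    auto_remove=True,
)
try:
    ...  # talk to the server over HTTP
finally:
    container.stop()  # runs even if the code above raises
```

Py-TGI wraps this lifecycle management for you, so you never have to `docker stop` anything by hand.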
## Usage
Here's an example of how to use it:
```python
from py_tgi import TGI, is_nvidia_system, is_rocm_system
llm = TGI(
    model="NousResearch/Llama-2-7b-hf",
    # pass ROCm devices on AMD systems, or all GPUs on NVIDIA systems
    devices=["/dev/kfd", "/dev/dri"] if is_rocm_system() else None,
    gpus="all" if is_nvidia_system() else None,
)
output = llm.generate(["Hi, I'm a language model", "I'm fine, how are you?"])
print(output)
```
Output:

```python
[" and I'm here to help you with any questions you have. What can I help you with",
 "\nUser 0: I'm doing well, thanks for asking. I'm just a"]
```
```python
from py_tgi import TEI, is_nvidia_system
embed = TEI(
    model="BAAI/bge-large-en-v1.5",
    dtype="float16",
    pooling="mean",
    gpus="all" if is_nvidia_system() else None,
)
output = embed.encode(["Hi, I'm an embedding model", "I'm fine, how are you?"])
print(output)
```
Output:

```python
[array([[ 0.01058742, -0.01588806, -0.03487622, ..., -0.01613717,
          0.01772875, -0.02237891]], dtype=float32),
 array([[ 0.02815401, -0.02892136, -0.0536355 , ..., 0.01225784,
         -0.00241452, -0.02836569]], dtype=float32)]
```
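Since `encode` returns NumPy arrays, you can work with the embeddings directly. Here is a minimal sketch (assuming `output` is the list of `(1, hidden_size)` arrays from the example above) that computes the cosine similarity between the two inputs:

```python
import numpy as np

a, b = output[0][0], output[1][0]  # flatten each (1, hidden_size) array to a vector
cosine = float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))
print(f"cosine similarity: {cosine:.4f}")
```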
That's it! Now you can write your Python scripts using the power of TGI and TEI without having to worry about the underlying Docker containers.