| Field | Value |
|---|---|
| Name | ingrain |
| Version | 0.2.3 |
| Summary | Python client for the ingrain server |
| Upload time | 2025-09-08 07:51:36 |
| Home page | None |
| Author | None |
| Maintainer | None |
| Docs URL | None |
| Requires Python | >=3.11 |
| License | None |
| Keywords | None |
| Requirements | No requirements were recorded. |
| Travis-CI | No Travis. |
| Coveralls test coverage | No coveralls. |
# Ingrain Python Client
[PyPI](https://pypi.org/project/ingrain/)

This is the Python client for the Ingrain API, providing a simple interface for interacting with an Ingrain server.
## Install
```bash
pip install ingrain
```
## Dev Setup
```bash
uv sync --dev
```
### Testing
#### Unit tests
```bash
uv run pytest
```
#### Integration tests and unit tests
These tests require a running Ingrain Server. You can start one with Docker Compose (for example, save the following as `compose.yml` and run `docker compose up -d`):
```yml
services:
  ingrain-models:
    image: owenpelliott/ingrain-models:latest
    container_name: ingrain-models
    ports:
      - "8687:8687"
    environment:
      - TRITON_GRPC_URL=triton:8001
      - MAX_BATCH_SIZE=16
      - MODEL_INSTANCES=1
      - INSTANCE_KIND=KIND_GPU # Change to KIND_CPU if using a CPU
    depends_on:
      - triton
    volumes:
      - ./model_repository:/app/model_repository
      - ${HOME}/.cache/huggingface:/app/model_cache/
  ingrain-inference:
    image: owenpelliott/ingrain-inference:latest
    container_name: ingrain-inference
    ports:
      - "8686:8686"
    environment:
      - TRITON_GRPC_URL=triton:8001
    depends_on:
      - triton
    volumes:
      - ./model_repository:/app/model_repository
  triton:
    image: nvcr.io/nvidia/tritonserver:25.06-py3
    container_name: triton
    runtime: nvidia # Remove if using a CPU
    environment:
      - NVIDIA_VISIBLE_DEVICES=all
    shm_size: "256m"
    command: >
      tritonserver
      --model-repository=/models
      --model-control-mode=explicit
    ports:
      - "8000:8000"
      - "8001:8001"
      - "8002:8002"
    volumes:
      - ./model_repository:/models
    restart: unless-stopped
```
```bash
uv run pytest --integration
```
## Raw data

```json
{
  "_id": null,
  "home_page": null,
  "name": "ingrain",
  "maintainer": null,
  "docs_url": null,
  "requires_python": ">=3.11",
  "maintainer_email": null,
  "keywords": null,
  "author": null,
  "author_email": null,
  "download_url": "https://files.pythonhosted.org/packages/74/71/a44723ed87a2d430ab0c05f14124108b197dfec0080e5660c1ca1d91f93e/ingrain-0.2.3.tar.gz",
  "platform": null,
  "description": "# Ingrain Python Client\n\n[](https://pypi.org/project/ingrain/)\n\n\nThis is the Python client for the Ingrain API. It provides a simple interface to interact with the Ingrain API.\n\n## Install\n \n```bash\npip install ingrain\n```\n\n## Dev Setup\n```bash\nuv sync --dev\n```\n\n### Testing\n\n#### Unit tests\n\n```bash\nuv run pytest\n```\n\n#### Integration tests and unit tests\n\nThis requires that Ingrain Server is running. You can start it with Docker Compose:\n\n```yml\nservices:\n ingrain-models:\n image: owenpelliott/ingrain-models:latest\n container_name: ingrain-models\n ports:\n - \"8687:8687\"\n environment:\n - TRITON_GRPC_URL=triton:8001\n - MAX_BATCH_SIZE=16\n - MODEL_INSTANCES=1\n - INSTANCE_KIND=KIND_GPU # Change to KIND_CPU if using a CPU\n depends_on:\n - triton\n volumes:\n - ./model_repository:/app/model_repository \n - ${HOME}/.cache/huggingface:/app/model_cache/\n ingrain-inference:\n image: owenpelliott/ingrain-inference:latest\n container_name: ingrain-inference\n ports:\n - \"8686:8686\"\n environment:\n - TRITON_GRPC_URL=triton:8001\n depends_on:\n - triton\n volumes:\n - ./model_repository:/app/model_repository \n triton:\n image: nvcr.io/nvidia/tritonserver:25.06-py3\n container_name: triton\n runtime: nvidia # Remove if using a CPU\n environment:\n - NVIDIA_VISIBLE_DEVICES=all\n shm_size: \"256m\"\n command: >\n tritonserver\n --model-repository=/models\n --model-control-mode=explicit\n ports:\n - \"8000:8000\"\n - \"8001:8001\"\n - \"8002:8002\"\n volumes:\n - ./model_repository:/models\n restart:\n unless-stopped\n```\n\n```bash\nuv run pytest --integration\n```\n",
  "bugtrack_url": null,
  "license": null,
  "summary": "Python client for the ingrain server",
  "version": "0.2.3",
  "project_urls": null,
  "split_keywords": [],
  "urls": [
    {
      "comment_text": null,
      "digests": {
        "blake2b_256": "50ae63019447220f97f09148de892306719bffcbe4334e13e60fc5a33788ca15",
        "md5": "b1a7722af5cd1abc62abfce9d78b6e6b",
        "sha256": "70ec63a4c6837c8aa0b168031aa76d6af2172888a2409bad09c4764b99262821"
      },
      "downloads": -1,
      "filename": "ingrain-0.2.3-py3-none-any.whl",
      "has_sig": false,
      "md5_digest": "b1a7722af5cd1abc62abfce9d78b6e6b",
      "packagetype": "bdist_wheel",
      "python_version": "py3",
      "requires_python": ">=3.11",
      "size": 9842,
      "upload_time": "2025-09-08T07:51:35",
      "upload_time_iso_8601": "2025-09-08T07:51:35.700925Z",
      "url": "https://files.pythonhosted.org/packages/50/ae/63019447220f97f09148de892306719bffcbe4334e13e60fc5a33788ca15/ingrain-0.2.3-py3-none-any.whl",
      "yanked": false,
      "yanked_reason": null
    },
    {
      "comment_text": null,
      "digests": {
        "blake2b_256": "7471a44723ed87a2d430ab0c05f14124108b197dfec0080e5660c1ca1d91f93e",
        "md5": "8c28a47e576ebf56b977baacc1fe0610",
        "sha256": "87763a46b0d796295fb9e9f8860f3361c80a6ea330b1ad42f8b34fdfe3cc502d"
      },
      "downloads": -1,
      "filename": "ingrain-0.2.3.tar.gz",
      "has_sig": false,
      "md5_digest": "8c28a47e576ebf56b977baacc1fe0610",
      "packagetype": "sdist",
      "python_version": "source",
      "requires_python": ">=3.11",
      "size": 11611,
      "upload_time": "2025-09-08T07:51:36",
      "upload_time_iso_8601": "2025-09-08T07:51:36.761234Z",
      "url": "https://files.pythonhosted.org/packages/74/71/a44723ed87a2d430ab0c05f14124108b197dfec0080e5660c1ca1d91f93e/ingrain-0.2.3.tar.gz",
      "yanked": false,
      "yanked_reason": null
    }
  ],
  "upload_time": "2025-09-08 07:51:36",
  "github": false,
  "gitlab": false,
  "bitbucket": false,
  "codeberg": false,
  "lcname": "ingrain"
}
```