wtu-mlflow-triton-plugin

Name: wtu-mlflow-triton-plugin
Version: 0.0.15
Home page: https://github.com/WIM-Corporation/w-train-utils-mlflow-triton-plugin
Summary: W-Train Utils for MLflow Triton Plugin
Upload time: 2024-10-30 10:43:33
Author: hbjs
Maintainer: None
Requires Python: >=3.7
License: None
Keywords: None
Docs URL: None
Requirements: No requirements were recorded.
# w-train-utils-mlflow-triton-plugin

## Virtual environment setup

```sh
pyenv install 3.8.18
pyenv virtualenv 3.8.18 wtrainclient3.8
pyenv activate wtrainclient3.8
```
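
With the virtual environment active, the plugin itself can be installed from PyPI. The package name and version come from the metadata above; pinning the version is optional:

```sh
# Install the plugin into the active virtualenv (requires Python >= 3.7)
pip install wtu-mlflow-triton-plugin==0.0.15
```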

---

## Running Triton Inference Server

```sh
docker run --rm -p8000:8000 -p8001:8001 -p8002:8002 \
    -e AWS_ACCESS_KEY_ID=<AccessKey> \
    -e AWS_SECRET_ACCESS_KEY=<SecretKey> \
    nvcr.io/nvidia/tritonserver:24.01-py3 \
    tritonserver --model-repository=s3://https://kitech-minio-api.wimcorp.dev:443/triton \
    --model-control-mode=explicit \
    --log-verbose=1
```
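
Once the container is running, the server can be checked against Triton's standard HTTP endpoints on port 8000. This is a sketch assuming the default port mappings above and valid MinIO credentials for the S3 model repository:

```sh
# Readiness check (HTTP endpoint, port 8000)
curl -v localhost:8000/v2/health/ready

# With --model-control-mode=explicit, models are loaded on demand;
# the repository index shows what the server sees in the S3 model repository
curl -s -X POST localhost:8000/v2/repository/index
```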

---

## Setting environment variables

Set the environment variables below before running the project; an export example follows the table.

| Variable               | Description                                            | Example                           |
| ---------------------- | ------------------------------------------------------ | --------------------------------- |
| MLFLOW_S3_ENDPOINT_URL | MinIO endpoint URL that MLflow uses as its store       | http://localhost:9000             |
| MLFLOW_TRACKING_URI    | URI of the MLflow tracking server                      | http://localhost:5001             |
| AWS_ACCESS_KEY_ID      | AWS-compatible access key for the MinIO server         | minio                             |
| AWS_SECRET_ACCESS_KEY  | AWS-compatible secret access key for the MinIO server  | miniostorage                      |
| TRITON_URL             | gRPC endpoint URL of the Triton Inference Server       | http://localhost:8001             |
| TRITON_MODEL_REPO      | Model repository URL of the Triton Inference Server    | s3://http://localhost:9000/triton |
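
For a local development setup these variables map directly to shell exports. The values below simply repeat the example column; replace them with the endpoints and credentials of your own MinIO, MLflow, and Triton instances:

```sh
# Example values only; substitute real endpoints and credentials
export MLFLOW_S3_ENDPOINT_URL=http://localhost:9000
export MLFLOW_TRACKING_URI=http://localhost:5001
export AWS_ACCESS_KEY_ID=minio
export AWS_SECRET_ACCESS_KEY=miniostorage
export TRITON_URL=http://localhost:8001
export TRITON_MODEL_REPO=s3://http://localhost:9000/triton
```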

---

## Building and uploading the package

```sh
# Install the required build dependencies
pip install wheel setuptools twine
```

```sh
vi ~/.pypirc
```

```ini
# ~/.pypirc
[distutils]
index-servers =
    pypi
    pypi-repository

[pypi]
username = __token__
password = <token>

[pypi-repository]
repository = https://<domain>/repository/<pypi-hosted>/
username = <username>
password = <password>
```

```sh
sh scripts/build.sh
sh scripts/deploy.sh
```
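
The contents of `scripts/build.sh` and `scripts/deploy.sh` are not shown in this README. A typical equivalent sequence, assuming a standard setuptools layout and the `~/.pypirc` indices configured above, would be:

```sh
# Build the sdist and wheel into ./dist (assumes a setup.py at the repository root)
python setup.py sdist bdist_wheel

# Upload to one of the indices configured in ~/.pypirc
twine upload -r pypi dist/*
# or, for the private index:
# twine upload -r pypi-repository dist/*
```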

            

Raw data

            {
    "_id": null,
    "home_page": "https://github.com/WIM-Corporation/w-train-utils-mlflow-triton-plugin",
    "name": "wtu-mlflow-triton-plugin",
    "maintainer": null,
    "docs_url": null,
    "requires_python": ">=3.7",
    "maintainer_email": null,
    "keywords": null,
    "author": "hbjs",
    "author_email": "hbjs97@naver.com",
    "download_url": "https://files.pythonhosted.org/packages/62/4b/cbed4f01ae5d92ed2ec8d48cafae8ca8fa76a64fe08f4a7a1ee60fc92821/wtu-mlflow-triton-plugin-0.0.15.tar.gz",
    "platform": null,
    "description": "# w-train-utils-mlflow-triton-plugin\n\n## \uac00\uc0c1\ud658\uacbd \uc124\uc815\n\n```sh\npyenv install 3.8.18\npyenv virtualenv 3.8.18 wtrainclient3.8\npyenv activate wtrainclient3.8\n```\n\n---\n\n## Triton Inference Server \uc2e4\ud589\n\n```sh\n$ docker run --rm -p8000:8000 -p8001:8001 -p8002:8002 \\\n    -e AWS_ACCESS_KEY_ID=<AccessKey> \\\n    -e AWS_SECRET_ACCESS_KEY=<SecretKey> \\\n    nvcr.io/nvidia/tritonserver:24.01-py3 \\\n    tritonserver --model-repository=s3://https://kitech-minio-api.wimcorp.dev:443/triton \\\n    --model-control-mode=explicit \\\n    --log-verbose=1\n```\n\n---\n\n## \ud658\uacbd \ubcc0\uc218 \uc124\uc815\n\n\ud504\ub85c\uc81d\ud2b8\ub97c \uc2e4\ud589\ud558\uae30 \uc804\uc5d0 \uc544\ub798\uc758 \ud658\uacbd \ubcc0\uc218\ub4e4\uc744 \uc124\uc815\ud574\uc57c \ud569\ub2c8\ub2e4:\n\n| \ud658\uacbd\ubcc0\uc218               | \uc124\uba85                                                | \uc608\uc2dc                              |\n| ---------------------- | --------------------------------------------------- | --------------------------------- |\n| MLFLOW_S3_ENDPOINT_URL | MLflow\uac00 \uc800\uc7a5\uc18c\ub85c \uc0ac\uc6a9\ud558\uace0\uc788\ub294 MinIO \uc5d4\ub4dc\ud3ec\uc778\ud2b8 URL | http://localhost:9000             |\n| MLFLOW_TRACKING_URI    | MLflow \ud2b8\ub798\ud0b9 \uc11c\ubc84\uc758 URI                            | http://localhost:5001             |\n| AWS_ACCESS_KEY_ID      | MinIO \uc11c\ubc84 \uc811\uadfc\uc744 \uc704\ud55c AWS \ud638\ud658 \uc561\uc138\uc2a4 \ud0a4           | minio                             |\n| AWS_SECRET_ACCESS_KEY  | MinIO \uc11c\ubc84 \uc811\uadfc\uc744 \uc704\ud55c AWS \ud638\ud658 \uc2dc\ud06c\ub9bf \uc561\uc138\uc2a4 \ud0a4    | miniostorage                      |\n| TRITON_URL             | Triton Inference Server \uc758 grpc \uc5d4\ub4dc\ud3ec\uc778\ud2b8 URL      | http://localhost:8001             |\n| TRITON_MODEL_REPO      | Triton Inference Server \uc758 \ubaa8\ub378\uc800\uc7a5\uc18c URL           | s3://http://localhost:9000/triton |\n\n---\n\n## \ud328\ud0a4\uc9c0 \ube4c\ub4dc \ubc0f \uc5c5\ub85c\ub4dc\n\n```sh\n# \ud544\uc694\ud55c \uc758\uc874\uc131 \uc124\uce58\npip install wheel setuptools twine\n```\n\n```sh\nvi ~/.pypirc\n\n[distutils]\nindex-servers =\n    pypi\n    pypi-repository\n\n[pypi]\n  username = __token__\n  password = <token>\n\n[pypi-repository]\nrepository: https://<domain>/repository/<pypi-hosted>/\nusername: <username>\npassword: <password>\n```\n\n```sh\nsh scripts/build.sh\nsh scripts/deploy.sh\n```\n",
    "bugtrack_url": null,
    "license": null,
    "summary": "W-Train Utils for MLflow Triton Plugin",
    "version": "0.0.15",
    "project_urls": {
        "Homepage": "https://github.com/WIM-Corporation/w-train-utils-mlflow-triton-plugin"
    },
    "split_keywords": [],
    "urls": [
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "a4480f2be6b932e72f2a02fb74a6914b3fd30cc53cb5e680a6bb6686144e718b",
                "md5": "1cbd2873e6236e4842566c7a511580f7",
                "sha256": "912b3339a6613500bb1f00ca7ac3bedd46e1bab20ada915a316fa0cd8399981f"
            },
            "downloads": -1,
            "filename": "wtu_mlflow_triton_plugin-0.0.15-py3-none-any.whl",
            "has_sig": false,
            "md5_digest": "1cbd2873e6236e4842566c7a511580f7",
            "packagetype": "bdist_wheel",
            "python_version": "py3",
            "requires_python": ">=3.7",
            "size": 13260,
            "upload_time": "2024-10-30T10:43:32",
            "upload_time_iso_8601": "2024-10-30T10:43:32.589020Z",
            "url": "https://files.pythonhosted.org/packages/a4/48/0f2be6b932e72f2a02fb74a6914b3fd30cc53cb5e680a6bb6686144e718b/wtu_mlflow_triton_plugin-0.0.15-py3-none-any.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "624bcbed4f01ae5d92ed2ec8d48cafae8ca8fa76a64fe08f4a7a1ee60fc92821",
                "md5": "95e28079f3abd8a3b75535e74127132b",
                "sha256": "e6d1db9bc3de31f8ee2ef0ec2e612c92f4be6a9cb41da39de5d6b6840693db59"
            },
            "downloads": -1,
            "filename": "wtu-mlflow-triton-plugin-0.0.15.tar.gz",
            "has_sig": false,
            "md5_digest": "95e28079f3abd8a3b75535e74127132b",
            "packagetype": "sdist",
            "python_version": "source",
            "requires_python": ">=3.7",
            "size": 11673,
            "upload_time": "2024-10-30T10:43:33",
            "upload_time_iso_8601": "2024-10-30T10:43:33.845610Z",
            "url": "https://files.pythonhosted.org/packages/62/4b/cbed4f01ae5d92ed2ec8d48cafae8ca8fa76a64fe08f4a7a1ee60fc92821/wtu-mlflow-triton-plugin-0.0.15.tar.gz",
            "yanked": false,
            "yanked_reason": null
        }
    ],
    "upload_time": "2024-10-30 10:43:33",
    "github": true,
    "gitlab": false,
    "bitbucket": false,
    "codeberg": false,
    "github_user": "WIM-Corporation",
    "github_project": "w-train-utils-mlflow-triton-plugin",
    "github_not_found": true,
    "lcname": "wtu-mlflow-triton-plugin"
}
        