| Field | Value |
|---|---|
| Name | triton-copilot |
| Version | 0.1.4 |
| Summary | None |
| home_page | None |
| upload_time | 2024-06-03 15:10:25 |
| maintainer | None |
| docs_url | None |
| author | Your Name |
| requires_python | <4.0,>=3.8 |
| license | None |
| keywords | None |
| requirements | No requirements were recorded. |
| Travis-CI | No Travis. |
| coveralls test coverage | No coveralls. |
# Triton-Server Co-Pilot
Triton-Server Co-Pilot is a tool that streamlines converting model code into Triton-Server compatible code, simplifying deployment on the NVIDIA Triton Inference Server. It automatically generates the necessary configuration file (`config.pbtxt`) and custom wrapper code (`model.py`), among other artifacts, enabling straightforward integration and deployment of AI models in production environments.
## Features and Benefits
- **Automatic Code Conversion**: Convert your model code into Triton-Server compatible code.
- **Configuration File Generation**: Automatically generates `config.pbtxt` files tailored to your specific model requirements.
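
To give a sense of the output, a minimal `config.pbtxt` in Triton's protobuf text format might look like the following. The model name, tensor names, and dimensions here are illustrative placeholders, not output produced by this tool:

```protobuf
name: "my_model"
backend: "python"
max_batch_size: 8
input [
  {
    name: "INPUT0"
    data_type: TYPE_FP32
    dims: [ 3 ]
  }
]
output [
  {
    name: "OUTPUT0"
    data_type: TYPE_FP32
    dims: [ 3 ]
  }
]
```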
## Prerequisites
- Python >=3.8,<4.0
- [Poetry](https://python-poetry.org/) for building the package
- Your model code ready for conversion
## Installation
1. Clone the Triton-Server Co-Pilot repository:
```bash
git clone https://github.com/inferless/triton-copilot.git
cd triton-copilot
poetry build
pip3 install ./dist/triton_copilot-0.1.4-py3-none-any.whl --force-reinstall
```
## File Structure
- `config.pbtxt`: Configuration file for Triton Server, specifying model parameters.
- `model.py`: Custom wrapper code for your model, ensuring compatibility with Triton Server.
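
As a rough illustration of what a `model.py` wrapper provides, Triton's Python backend looks for a class named `TritonPythonModel` with `initialize`, `execute`, and `finalize` methods. The sketch below is not output from this tool: in a real deployment the requests are `pb_utils.InferenceRequest` objects from `triton_python_backend_utils` (only importable inside the Triton container), so here they are stubbed as NumPy arrays to show the control flow standalone:

```python
import numpy as np

class TritonPythonModel:
    """Skeleton of the class Triton's Python backend expects in model.py.

    Hypothetical stub: requests are plain NumPy arrays here, whereas Triton
    passes pb_utils.InferenceRequest objects at runtime.
    """

    def initialize(self, args):
        # Called once when the model is loaded; load weights/config here.
        self.scale = np.float32(2.0)

    def execute(self, requests):
        # Called per batch of inference requests; must return one
        # response per request, in order.
        return [req.astype(np.float32) * self.scale for req in requests]

    def finalize(self):
        # Called once when the model is unloaded; release resources here.
        pass
```

For example, calling `execute([np.array([1, 2, 3])])` on an initialized instance of this stub returns a single scaled array.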
## Contributing
Contributions to Triton-Server Co-Pilot are welcome! Please refer to our contribution guidelines for more information on how to submit issues, make pull requests, and more.
## License
This project is licensed under the [MIT License](LICENSE). See the LICENSE file for more details.
## Contact
For support or queries, please contact us at [hello@inferless.com](mailto:hello@inferless.com).
Raw data
{
"_id": null,
"home_page": null,
"name": "triton-copilot",
"maintainer": null,
"docs_url": null,
"requires_python": "<4.0,>=3.8",
"maintainer_email": null,
"keywords": null,
"author": "Your Name",
"author_email": "you@example.com",
"download_url": "https://files.pythonhosted.org/packages/44/28/dcdb9d7a7b1be88fba960de01dafd832e031c1896d8437e5398ae682f259/triton_copilot-0.1.4.tar.gz",
"platform": null,
"description": "# Triton-Server Co-Pilot\n\nTriton-Server Co-Pilot is an innovative tool designed to streamline the process of converting any model code into Triton-Server compatible code, simplifying deployment on NVIDIA Triton Inference Server. This project automatically generates necessary configuration files (`config.pbtxt`) and custom wrapper code (`model.py`), among others, facilitating seamless integration and deployment of AI models in production environments.\n\n## Features and Benefits\n\n- **Automatic Code Conversion**: Convert your model code into Triton-Server compatible code.\n- **Configuration File Generation**: Automatically generates `config.pbtxt` files tailored to your specific model requirements.\n\n## Prerequisites\n- Your model code ready for conversion\n\n## Installation\n\n1. Clone the Triton-Server Co-Pilot repository:\n ```bash\n git clone https://github.com/inferless/triton-copilot.git\n cd triton-copilot\n poetry build\n pip3 install ./dist/triton_copilot-0.1.0-py3-none-any.whl --force-reinstall \n\n### 6. File Structure\n\n```markdown\n## File Structure\n\n- `config.pbtxt`: Configuration file for Triton Server, specifying model parameters.\n- `model.py`: Custom wrapper code for your model, ensuring compatibility with Triton Server.\n```\n\n## Contributing\n\nContributions to Triton-Server Co-Pilot are welcome! Please refer to our contribution guidelines for more information on how to submit issues, make pull requests, and more.\n\n## License\n\nThis project is licensed under the [MIT License](LICENSE). See the LICENSE file for more details.\n\n## Contact\n\nFor support or queries, please contact us at [hello@inferless.com].\n",
"bugtrack_url": null,
"license": null,
"summary": null,
"version": "0.1.4",
"project_urls": null,
"split_keywords": [],
"urls": [
{
"comment_text": "",
"digests": {
"blake2b_256": "e5810532d72e771621112c865c0ad386317232117ed9d4831bbc5b865022a507",
"md5": "f2eec6854d92427c7733cdd5d68000b1",
"sha256": "1f7604f4dc5b4de851c3519f80382d7ef41030a1be70ef9eb59e3d3a8fb953da"
},
"downloads": -1,
"filename": "triton_copilot-0.1.4-py3-none-any.whl",
"has_sig": false,
"md5_digest": "f2eec6854d92427c7733cdd5d68000b1",
"packagetype": "bdist_wheel",
"python_version": "py3",
"requires_python": "<4.0,>=3.8",
"size": 17395,
"upload_time": "2024-06-03T15:10:23",
"upload_time_iso_8601": "2024-06-03T15:10:23.665486Z",
"url": "https://files.pythonhosted.org/packages/e5/81/0532d72e771621112c865c0ad386317232117ed9d4831bbc5b865022a507/triton_copilot-0.1.4-py3-none-any.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": "",
"digests": {
"blake2b_256": "4428dcdb9d7a7b1be88fba960de01dafd832e031c1896d8437e5398ae682f259",
"md5": "9374141bbce1640eb4efbf962aa09df2",
"sha256": "0822247f3113923457bc047fa4b0f62e389100591a558a8454b45aa808bcff02"
},
"downloads": -1,
"filename": "triton_copilot-0.1.4.tar.gz",
"has_sig": false,
"md5_digest": "9374141bbce1640eb4efbf962aa09df2",
"packagetype": "sdist",
"python_version": "source",
"requires_python": "<4.0,>=3.8",
"size": 14036,
"upload_time": "2024-06-03T15:10:25",
"upload_time_iso_8601": "2024-06-03T15:10:25.532483Z",
"url": "https://files.pythonhosted.org/packages/44/28/dcdb9d7a7b1be88fba960de01dafd832e031c1896d8437e5398ae682f259/triton_copilot-0.1.4.tar.gz",
"yanked": false,
"yanked_reason": null
}
],
"upload_time": "2024-06-03 15:10:25",
"github": false,
"gitlab": false,
"bitbucket": false,
"codeberg": false,
"lcname": "triton-copilot"
}