trtruntime

Name: trtruntime
Version: 0.1.0
Summary: A lightweight TensorRT inference runtime for Python, inspired by onnxruntime
Upload time: 2025-08-10 20:12:58
Home page: None
Author: None
Maintainer: None
Docs URL: None
Requires Python: >=3.8
License: MIT
Keywords: deepstream, gstreamer, cli, onnx, automation, ai, artificial intelligence, computer vision, edge ai, intelligent edge, craftifai, intelligent edge systems, ies, nvidia, jetson, inference, opencv, triton inference server, tensorrt, onnxruntime
Requirements: No requirements were recorded.
# trtruntime

`trtruntime` is a lightweight Python package that provides a TensorRT inference runtime similar in API style to [onnxruntime](https://onnxruntime.ai/). It allows easy loading and running of TensorRT engines with a clean and simple interface.
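
A serialized engine is typically produced ahead of time, for example by converting an ONNX model with NVIDIA's `trtexec` tool (shown here as a common workflow; the file names are placeholders):

```bash
# Convert an ONNX model into a serialized TensorRT engine (placeholder file names)
trtexec --onnx=model.onnx --saveEngine=model.engine
```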

## Features

- Load serialized TensorRT engine files (`*.engine`) or plan files (`*.plan`)
- Automatically handle CUDA memory bindings and streams
- Simple API modeled after onnxruntime's `InferenceSession`
- Supports querying input and output tensor metadata
- Compatible with TensorRT, PyCUDA, and NumPy

## Installation

```bash
pip install trtruntime
```

> **Note:** TensorRT and PyCUDA must already be installed on your system; they are not pulled in automatically, as the package declares no requirements.
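
A quick way to confirm both prerequisites before installing (a minimal sanity check using the TensorRT and PyCUDA Python APIs, separate from `trtruntime` itself):

```python
# Minimal sanity check (not part of trtruntime): confirm that TensorRT and
# PyCUDA are importable and that a CUDA device is visible.
import tensorrt as trt
import pycuda.driver as cuda
import pycuda.autoinit  # noqa: F401  -- creates a CUDA context on the default device

print("TensorRT version:", trt.__version__)
print("CUDA device:", cuda.Device(0).name())
```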

## Usage

```python
import numpy as np
from trtruntime import InferenceSession

# Create inference session with your TensorRT engine file
sess = InferenceSession("model.engine")

# Prepare input feed as dictionary {input_name: numpy_array}
input_feed = {
    "input_1": np.random.rand(1, 3, 224, 224).astype(np.float32),
}

# Run inference
outputs = sess.run(output_names=None, input_feed=input_feed)

# outputs is a list of numpy arrays corresponding to requested outputs
print(outputs)
```
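
The feature list above mentions querying input and output tensor metadata, but the README does not show the call. If the session mirrors onnxruntime's accessors (an assumption; `get_inputs()` / `get_outputs()` and the attribute names are not confirmed by this package), such a query might look like:

```python
# Hypothetical sketch: the accessors are assumed to follow onnxruntime's
# InferenceSession API (get_inputs()/get_outputs()) and are NOT documented here.
for inp in sess.get_inputs():
    print("input:", inp.name, inp.shape)
for out in sess.get_outputs():
    print("output:", out.name, out.shape)
```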


Raw data

{
    "_id": null,
    "home_page": null,
    "name": "trtruntime",
    "maintainer": null,
    "docs_url": null,
    "requires_python": ">=3.8",
    "maintainer_email": null,
    "keywords": "deepstream, gstreamer, cli, onnx, automation, ai, artificial intelligence, computer vision, edge ai, intelligent edge, craftifai, intelligent edge systems, ies, nvidia, jetson, inference, opencv, triton inference server, tensorrt, onnxruntime",
    "author": null,
    "author_email": "Mohd Yusuf <yusuf.intelligentedgesystems@gmail.com>",
    "download_url": "https://files.pythonhosted.org/packages/c6/e7/c788104090a4ef314dce751dca3f9f2ee3b11cc04ca410cb717c82c7bd91/trtruntime-0.1.0.tar.gz",
    "platform": null,
    "description": "# trtruntime\n\n`trtruntime` is a lightweight Python package that provides a TensorRT inference runtime similar in API style to [onnxruntime](https://onnxruntime.ai/). It allows easy loading and running of TensorRT engines with a clean and simple interface.\n\n## Features\n\n- Load serialized TensorRT engine files (`*.engine`) or plan files (`*.plan`)\n- Automatically handle CUDA memory bindings and streams\n- Simple API modeled after onnxruntime's `InferenceSession`\n- Supports querying input and output tensor metadata\n- Compatible with TensorRT, PyCUDA, and NumPy\n\n## Installation\n\n```bash\npip install trtruntime\n````\n\n> **Note:** You need to have TensorRT and PyCUDA installed on your system.\n\n## Usage\n\n```python\nimport numpy as np\nfrom trtruntime import InferenceSession\n\n# Create inference session with your TensorRT engine file\nsess = InferenceSession(\"model.engine\")\n\n# Prepare input feed as dictionary {input_name: numpy_array}\ninput_feed = {\n    \"input_1\": np.random.rand(1, 3, 224, 224).astype(np.float32),\n}\n\n# Run inference\noutputs = sess.run(output_names=None, input_feed=input_feed)\n\n# outputs is a list of numpy arrays corresponding to requested outputs\nprint(outputs)\n```\n",
    "bugtrack_url": null,
    "license": "MIT",
    "summary": "A lightweight TensorRT inference runtime for Python, inspired by onnxruntime",
    "version": "0.1.0",
    "project_urls": null,
    "split_keywords": [
        "deepstream",
        " gstreamer",
        " cli",
        " onnx",
        " automation",
        " ai",
        " artificial intelligence",
        " computer vision",
        " edge ai",
        " intelligent edge",
        " craftifai",
        " intelligent edge systems",
        " ies",
        " nvidia",
        " jetson",
        " inference",
        " opencv",
        " triton inference server",
        " tensorrt",
        " onnxruntime"
    ],
    "urls": [
        {
            "comment_text": null,
            "digests": {
                "blake2b_256": "05fdcace1809edc4f6a8c063b2e3c18faa1bf4e31b83b017c555e1848cb8e82e",
                "md5": "43989a734bd82ac3e57b84b1e0c423c9",
                "sha256": "6d76c7a4c3848f5665d8e499163ffad7ef23b4bc3fd17db414dc4379631ebce9"
            },
            "downloads": -1,
            "filename": "trtruntime-0.1.0-py3-none-any.whl",
            "has_sig": false,
            "md5_digest": "43989a734bd82ac3e57b84b1e0c423c9",
            "packagetype": "bdist_wheel",
            "python_version": "py3",
            "requires_python": ">=3.8",
            "size": 3298,
            "upload_time": "2025-08-10T20:12:57",
            "upload_time_iso_8601": "2025-08-10T20:12:57.483995Z",
            "url": "https://files.pythonhosted.org/packages/05/fd/cace1809edc4f6a8c063b2e3c18faa1bf4e31b83b017c555e1848cb8e82e/trtruntime-0.1.0-py3-none-any.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": null,
            "digests": {
                "blake2b_256": "c6e7c788104090a4ef314dce751dca3f9f2ee3b11cc04ca410cb717c82c7bd91",
                "md5": "604581c86d9cc0056052b8cdc6444b2e",
                "sha256": "976afbc3e5789fca0166ed3ef21f6fc7ecbae30771f6f43a5e23c0d04f83d55d"
            },
            "downloads": -1,
            "filename": "trtruntime-0.1.0.tar.gz",
            "has_sig": false,
            "md5_digest": "604581c86d9cc0056052b8cdc6444b2e",
            "packagetype": "sdist",
            "python_version": "source",
            "requires_python": ">=3.8",
            "size": 3136,
            "upload_time": "2025-08-10T20:12:58",
            "upload_time_iso_8601": "2025-08-10T20:12:58.813009Z",
            "url": "https://files.pythonhosted.org/packages/c6/e7/c788104090a4ef314dce751dca3f9f2ee3b11cc04ca410cb717c82c7bd91/trtruntime-0.1.0.tar.gz",
            "yanked": false,
            "yanked_reason": null
        }
    ],
    "upload_time": "2025-08-10 20:12:58",
    "github": false,
    "gitlab": false,
    "bitbucket": false,
    "codeberg": false,
    "lcname": "trtruntime"
}