truss

Name: truss
Version: 0.11.1 (PyPI)
Summary: A seamless bridge from model development to model delivery
Upload time: 2025-09-10 18:55:39
Requires Python: <3.14,>=3.9
Keywords: AI, MLOps, machine learning, model deployment, model serving
# Truss

**The simplest way to serve AI/ML models in production**

[![PyPI version](https://badge.fury.io/py/truss.svg)](https://badge.fury.io/py/truss)
[![ci_status](https://github.com/basetenlabs/truss/actions/workflows/release.yml/badge.svg)](https://github.com/basetenlabs/truss/actions/workflows/release.yml)

## Why Truss?

* **Write once, run anywhere:** Package and test model code, weights, and dependencies with a model server that behaves the same in development and production.
* **Fast developer loop:** Implement your model with fast feedback from a live reload server, and skip Docker and Kubernetes configuration with a batteries-included model serving environment.
* **Support for all Python frameworks:** From `transformers` and `diffusers` to `PyTorch` and `TensorFlow` to `TensorRT` and `Triton`, Truss supports models created and served with any framework.

See Trusses for popular models including:

* 🦙 [Llama 2 7B](https://github.com/basetenlabs/truss-examples/tree/main/llama/llama-2-7b-chat) ([13B](https://github.com/basetenlabs/truss-examples/tree/main/llama/llama-2-13b-chat)) ([70B](https://github.com/basetenlabs/truss-examples/tree/main/llama/llama-2-70b-chat))
* 🎨 [Stable Diffusion XL](https://github.com/basetenlabs/truss-examples/tree/main/stable-diffusion/stable-diffusion-xl-1.0)
* 🗣 [Whisper](https://github.com/basetenlabs/truss-examples/tree/main/whisper/whisper-truss)

and [dozens more examples](https://github.com/basetenlabs/truss-examples/).

## Installation

Install Truss with:

```sh
pip install --upgrade truss
```

## Quickstart

As a quick example, we'll package a [text classification pipeline](https://huggingface.co/docs/transformers/main_classes/pipelines) from the open-source [`transformers` package](https://github.com/huggingface/transformers).

### Create a Truss

To get started, create a Truss with the following terminal command:

```sh
truss init text-classification
```

When prompted, give your Truss a name like `Text classification`.

Then, navigate to the newly created directory:

```sh
cd text-classification
```

### Implement the model

One of the two essential files in a Truss is `model/model.py`. In this file, you write a `Model` class: an interface between the ML model that you're packaging and the model server that you're running it on.

There are two member functions that you must implement in the `Model` class:

* `load()` loads the model onto the model server. It runs exactly once when the model server is spun up or patched.
* `predict()` handles model inference. It runs every time the model server is called.

Here's the complete `model/model.py` for the text classification model:

```python
from transformers import pipeline


class Model:
    def __init__(self, **kwargs):
        self._model = None

    def load(self):
        self._model = pipeline("text-classification")

    def predict(self, model_input):
        return self._model(model_input)
```
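To see how the two member functions fit together, here is a hypothetical driver loop of the kind a model server might run (a minimal sketch, not Truss's actual implementation): `load()` runs exactly once at startup, then `predict()` runs per request. A stub stands in for the `transformers` pipeline so the sketch runs anywhere:

```python
class StubModel:
    """Same interface as model/model.py, with a stub in place of
    the transformers pipeline so no heavy dependencies are needed."""

    def __init__(self, **kwargs):
        self._model = None

    def load(self):
        # In the real model, this would be pipeline("text-classification").
        self._model = lambda text: [{"label": "POSITIVE", "score": 0.99}]

    def predict(self, model_input):
        return self._model(model_input)


# Hypothetical server lifecycle: load once, then predict per request.
model = StubModel()
model.load()
for request in ["Truss is awesome!", "Another input"]:
    print(model.predict(request))
```

Because `load()` is separated from `__init__`, expensive work like downloading weights happens once on the server, not every time the class is constructed.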

### Add model dependencies

The other essential file in a Truss is `config.yaml`, which configures the model serving environment. For a complete list of the config options, see [the config reference](https://truss.baseten.co/reference/config).

The pipeline model relies on [Transformers](https://huggingface.co/docs/transformers/index) and [PyTorch](https://pytorch.org/). These dependencies must be specified in the Truss config.

In `config.yaml`, find the line `requirements`. Replace the empty list with:

```yaml
requirements:
  - torch==2.0.1
  - transformers==4.30.0
```

No other configuration is needed.

## Deployment

Truss is maintained by [Baseten](https://baseten.co), which provides infrastructure for running ML models in production. We'll use Baseten as the remote host for your model.

Other remotes are coming soon, starting with AWS SageMaker.

### Get an API key

To set up the Baseten remote, you'll need a [Baseten API key](https://app.baseten.co/settings/account/api_keys). If you don't have a Baseten account, [sign up](https://app.baseten.co/signup/) and you'll be issued free credits to get started.

### Run `truss push`

With your Baseten API key ready to paste when prompted, you can deploy your model:

```sh
truss push
```

You can monitor your model deployment from [your model dashboard on Baseten](https://app.baseten.co/models/).

### Invoke the model

After the model has finished deploying, you can invoke it from the terminal.

**Invocation**

```sh
truss predict -d '"Truss is awesome!"'
```

**Response**

```json
[
  {
    "label": "POSITIVE",
    "score": 0.999873161315918
  }
]
```
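If you call the deployed model from Python rather than the CLI, the response shown above is plain JSON and can be handled with the standard library. This sketch uses the example payload copied from the output above; how you obtain the body over HTTP (e.g. with `requests`) is up to you:

```python
import json

# JSON body copied from the example invocation above.
response_body = '[{"label": "POSITIVE", "score": 0.999873161315918}]'

# The model returns a list of predictions, one dict per input.
predictions = json.loads(response_body)
top = predictions[0]
print(f"{top['label']} ({top['score']:.4f})")  # POSITIVE (0.9999)
```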

## Truss contributors

Truss is backed by Baseten and built in collaboration with ML engineers worldwide. Special thanks to [Stephan Auerhahn](https://github.com/palp) @ [stability.ai](https://stability.ai/) and [Daniel Sarfati](https://github.com/dsarfati) @ [Salad Technologies](https://salad.com/) for their contributions.

We enthusiastically welcome contributions in accordance with our [contributors' guide](CONTRIBUTING.md) and [code of conduct](CODE_OF_CONDUCT.md).
