truss

Name: truss
Version: 0.3.0
Home page: https://github.com/basetenlabs/truss
Summary: A seamless bridge from model development to model delivery
Upload time: 2023-02-01 00:29:48
Author: Pankaj Gupta
Requires Python: >=3.8,<3.11
License: MIT
Keywords: MLOps, AI, model serving, model deployment, machine learning

# Truss

**Serve any model without boilerplate code**

![Truss logo](https://raw.githubusercontent.com/basetenlabs/truss/main/docs/assets/truss_logo_horizontal.png)

[![PyPI version](https://badge.fury.io/py/truss.svg)](https://badge.fury.io/py/truss)
[![ci_status](https://github.com/basetenlabs/truss/actions/workflows/main.yml/badge.svg)](https://github.com/basetenlabs/truss/actions/workflows/main.yml)

Meet Truss, a seamless bridge from model development to model delivery. Truss presents an open-source standard for packaging models built in any framework for sharing and deployment in any environment, local or production.

Get started with the [end-to-end tutorial](https://truss.baseten.co/e2e).

## What can I do with Truss?

If you've ever tried to get a model out of a Jupyter notebook, Truss is for you.

Truss exposes just the right amount of complexity around things like Docker and APIs, so you don't really have to think about them. Here are some of the things Truss does:

* 🏎 Turns your Python model into a microservice with a production-ready API endpoint, no need for Flask or Django.
* 🎚 For most popular frameworks, includes automatic model serialization and deserialization.
* 🛍 Freezes dependencies via Docker to make your training environment portable.
* 🕰 Enables rapid iteration with local development that matches your production environment.
* 🗃 Encourages shipping parsing and even business logic alongside your model with integrated pre- and post-processing functions (see the sketch after this list).
* 🤖 Supports running predictions on GPUs. (Currently limited to certain hardware, more coming soon)
* 🙉 Bundles secret management to securely give your model access to API keys.
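
To illustrate the pre- and post-processing idea from the list above, here is a hypothetical sketch of the pattern. It is not the exact Truss model interface; the class, method names, and signatures below are illustrative only.

```python
class IrisClassifierWrapper:
    """Hypothetical example of bundling parsing and business logic with a model."""

    def __init__(self, model, class_names):
        self._model = model
        self._class_names = class_names

    def preprocess(self, request):
        # Parse the raw request payload into the feature array the model expects.
        return request["inputs"]

    def postprocess(self, predictions):
        # Business logic: map numeric class indices to human-readable labels.
        return {"predictions": [self._class_names[int(p)] for p in predictions]}

    def predict(self, request):
        features = self.preprocess(request)
        return self.postprocess(self._model.predict(features))
```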

## Installation

Truss requires Python >=3.8, <3.11.

To install from [PyPI](https://pypi.org/project/truss/), run:

```
pip install truss
```
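
To confirm the installation worked, you can print the installed version with the Python standard library; a minimal sketch (`importlib.metadata` requires Python 3.8+, which matches the requirement above):

```python
# Print the installed Truss version using only the standard library.
from importlib.metadata import version

print(version("truss"))
```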

To download the source code directly (for development), clone this repository and follow the setup commands in our [contributors' guide](CONTRIBUTING.md).

Truss is actively developed, and we recommend using the latest version. To update your Truss installation, run:

```
pip install --upgrade truss
```

Though Truss is in beta, we do care about backward compatibility. Review the [release notes](docs/CHANGELOG.md) before upgrading, and note that we follow semantic versioning, so any breaking changes require the release of a new major version.

## How to use Truss

Generate and serve predictions from a Truss with [this Jupyter notebook](docs/notebooks/sklearn_example.ipynb).

### Quickstart: making a Truss

```python
!pip install --upgrade scikit-learn truss

import truss
from sklearn.ensemble import RandomForestClassifier
from sklearn.datasets import load_iris

# Load the iris data set
iris = load_iris()
data_x = iris['data']
data_y = iris['target']

# Train the model
rfc = RandomForestClassifier()
rfc.fit(data_x, data_y)

# Create the Truss (serializing & packaging model)
tr = truss.create(rfc, target_directory="iris_rfc_truss")

# Serve a prediction from the model
tr.predict({"inputs": [[0, 0, 0, 0]]})
```

### Package your model

The `truss.create()` function can be used with any supported framework:

* [Hugging Face](https://truss.baseten.co/create/huggingface)
* [LightGBM](https://truss.baseten.co/create/lightgbm)
* [PyTorch](https://truss.baseten.co/create/pytorch)
* [scikit-learn](https://truss.baseten.co/create/sklearn)
* [TensorFlow](https://truss.baseten.co/create/tensorflow)
* [XGBoost](https://truss.baseten.co/create/xgboost)
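
As an example with another supported framework, here is a minimal sketch that packages a Hugging Face `transformers` pipeline the same way as the scikit-learn quickstart. The pipeline task, input text, and target directory are illustrative choices, and the payload follows the quickstart's `{"inputs": [...]}` pattern, which may need adapting to your pipeline.

```python
!pip install --upgrade transformers truss

import truss
from transformers import pipeline

# Build a Hugging Face pipeline and package it as a Truss, as in the quickstart.
classifier = pipeline("text-classification")
tr = truss.create(classifier, target_directory="text_classification_truss")

# Serve a prediction from the packaged model.
tr.predict({"inputs": ["Truss is a seamless bridge from model development to model delivery."]})
```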

But in more complex cases, you can build a Truss manually for any model. Start with `truss init my_truss` and follow [this guide](https://truss.baseten.co/create/manual).

### Serve your model locally

Serving your model with Truss on Docker lets you interface with your model via HTTP requests. Start your model server with:

```
truss run-image iris_rfc_truss
```

Then, as long as the container is running, you can invoke the model as an API as follows:

```
curl -X POST http://127.0.0.1:8080/v1/models/model:predict -d '{"inputs": [[0, 0, 0, 0]]}'
```
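
The same endpoint can also be called from Python; a minimal sketch using the `requests` library (assumed to be installed separately):

```python
import requests

# Call the local model server started with `truss run-image iris_rfc_truss`.
# The URL and payload mirror the curl example above.
response = requests.post(
    "http://127.0.0.1:8080/v1/models/model:predict",
    json={"inputs": [[0, 0, 0, 0]]},
)
print(response.json())
```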

### Configure your model for deployment

Truss is configurable to its core. Every Truss must include a file `config.yaml` in its root directory, which is automatically generated when the Truss is created. However, configuration is optional. Every configurable value has a sensible default, and a completely empty config file is valid.

The Truss we generated in the quickstart above comes with a good example of a typical Truss config:

```yaml
model_framework: sklearn
model_metadata:
  model_binary_dir: model
  supports_predict_proba: true
python_version: py39
requirements:
- scikit-learn==1.0.2
- threadpoolctl==3.0.0
- joblib==1.1.0
- numpy==1.20.3
- scipy==1.7.3
```

Follow the [configuration guide](https://truss.baseten.co/develop/configuration) and use the complete reference of configurable properties to make your Truss perform exactly as you wish.
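
If you prefer to adjust the generated config programmatically rather than by hand, here is a minimal sketch using PyYAML (an assumption: PyYAML is installed separately; only the `requirements` field shown above is touched, and the added package and version are illustrative):

```python
import yaml

# Path to the config generated for the quickstart Truss.
config_path = "iris_rfc_truss/config.yaml"

with open(config_path) as f:
    config = yaml.safe_load(f)

# Append an extra pinned dependency to the requirements list.
config.setdefault("requirements", []).append("pandas==1.5.3")

with open(config_path, "w") as f:
    yaml.safe_dump(config, f)
```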

### Deploy your model

You can deploy a Truss anywhere that can run a Docker image, as well as to purpose-built platforms like [Baseten](https://baseten.co).

Follow step-by-step deployment guides for the following platforms:

* [AWS ECS](https://truss.baseten.co/deploy/aws)
* [Baseten](https://truss.baseten.co/deploy/baseten)
* [GCP Cloud Run](https://truss.baseten.co/deploy/gcp)

## Contributing

We hope this vision excites you, and we gratefully welcome contributions in accordance with our [contributors' guide](CONTRIBUTING.md) and [code of conduct](CODE_OF_CONDUCT.md).

Truss was first developed at [Baseten](https://baseten.co) by maintainers Phil Howes, Pankaj Gupta, and Alex Gillmor.

## GitHub Codespaces

If your organization allows access to GitHub Codespaces, you can launch a Codespace for Truss development. If you are using a GPU Codespace, make sure to use the `.devcontainer/gpu/devcontainer.json` configuration so that a GPU is available and usable in Docker with Truss.

            
