mlserver-xgboost

Name: mlserver-xgboost
Version: 1.6.1
Summary: XGBoost runtime for MLServer
Author: Seldon Technologies Ltd.
License: Apache-2.0
Requires Python: <3.12,>=3.9
Upload time: 2024-09-10 15:10:53
# XGBoost runtime for MLServer

This package provides an MLServer runtime compatible with XGBoost.

## Usage

You can install the runtime, alongside `mlserver`, as:

```bash
pip install mlserver mlserver-xgboost
```

For further information on how to use MLServer with XGBoost, check out
this [worked example](../../docs/examples/xgboost/README.md).

## XGBoost Artifact Type

The XGBoost inference runtime expects your model to be serialised via one
of the following methods:

| Extension | Docs                                                                                                                 | Example                            |
| --------- | -------------------------------------------------------------------------------------------------------------------- | ---------------------------------- |
| `*.json`  | [JSON Format](https://xgboost.readthedocs.io/en/stable/tutorials/saving_model.html#introduction-to-model-io)         | `booster.save_model("model.json")` |
| `*.ubj`   | [Binary JSON Format](https://xgboost.readthedocs.io/en/stable/tutorials/saving_model.html#introduction-to-model-io)  | `booster.save_model("model.ubj")`  |
| `*.bst`   | [(Old) Binary Format](https://xgboost.readthedocs.io/en/stable/tutorials/saving_model.html#introduction-to-model-io) | `booster.save_model("model.bst")`  |

````{note}
By default, the runtime will look for a file called `model.[json | ubj | bst]`.
However, this can be modified through the `parameters.uri` field of your
{class}`ModelSettings <mlserver.settings.ModelSettings>` config (see the
section on [Model Settings](../../docs/reference/model-settings.md) for more
details).

```{code-block} json
---
emphasize-lines: 3-5
---
{
  "name": "foo",
  "parameters": {
    "uri": "./my-own-model-filename.json"
  }
}
```
````

## Content Types

If no [content type](../../docs/user-guide/content-type) is present on the
request or metadata, the XGBoost runtime will try to decode the payload as a
[NumPy Array](../../docs/user-guide/content-type).
To avoid this, either send a different content type explicitly, or define the
correct one as part of your [model's
metadata](../../docs/reference/model-settings).
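For instance, a request can declare the content type explicitly on the input itself via the `parameters.content_type` field (a sketch; the input name and values here are illustrative, and `np` decodes the payload as a NumPy array):

```{code-block} json
---
emphasize-lines: 8-10
---
{
  "inputs": [
    {
      "name": "my-input",
      "datatype": "FP32",
      "shape": [2, 2],
      "data": [1.0, 2.0, 3.0, 4.0],
      "parameters": {
        "content_type": "np"
      }
    }
  ]
}
```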

## Model Outputs

The XGBoost inference runtime exposes a number of outputs depending on the
model type.
These outputs map to the `predict` and `predict_proba` methods of the XGBoost
model.

| Output          | Returned By Default | Availability                                                          |
| --------------- | ------------------- | --------------------------------------------------------------------- |
| `predict`       | ✅                  | Available on all XGBoost models.                                      |
| `predict_proba` | ❌                  | Only available on non-regressor models (i.e. `XGBClassifier` models). |

By default, the runtime will only return the output of `predict`.
However, you can control which outputs are returned through the
`outputs` field of your {class}`InferenceRequest
<mlserver.types.InferenceRequest>` payload.

For example, to only return the model's `predict_proba` output, you could
define a payload such as:

```{code-block} json
---
emphasize-lines: 10-12
---
{
  "inputs": [
    {
      "name": "my-input",
      "datatype": "INT32",
      "shape": [2, 2],
      "data": [1, 2, 3, 4]
    }
  ],
  "outputs": [
    { "name": "predict_proba" }
  ]
}
```
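Such a payload can be sent to the model's inference endpoint following the Open Inference Protocol. The sketch below assumes MLServer is listening on `localhost:8080` and serves a model named `foo` (both hypothetical):

```python
# Sketch: build the payload above and prepare a request against the
# model's v2 inference endpoint. Host, port, and model name "foo" are
# assumptions for illustration.
import json
import urllib.request

payload = {
    "inputs": [
        {
            "name": "my-input",
            "datatype": "INT32",
            "shape": [2, 2],
            "data": [1, 2, 3, 4],
        }
    ],
    "outputs": [{"name": "predict_proba"}],
}

request = urllib.request.Request(
    "http://localhost:8080/v2/models/foo/infer",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
# With a running server, uncomment to send the request:
# with urllib.request.urlopen(request) as response:
#     print(json.load(response))
```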

            
