tensorzero

Name: tensorzero
Version: 2025.7.0
Summary: The Python client for TensorZero
Upload time: 2025-07-10 21:39:03
Requires Python: >=3.9
License: Apache-2.0
Keywords: artificial intelligence, ai, machine learning, ml, large language model, large language models, llm, llms, natural language processing, nlp, generative ai, genai
# TensorZero Python Client

**[Website](https://www.tensorzero.com/)** ·
**[Docs](https://www.tensorzero.com/docs)** ·
**[Twitter](https://www.x.com/tensorzero)** ·
**[Slack](https://www.tensorzero.com/slack)** ·
**[Discord](https://www.tensorzero.com/discord)**

**[Quick Start (5min)](https://www.tensorzero.com/docs/quickstart)** ·
**[Comprehensive Tutorial](https://www.tensorzero.com/docs/gateway/tutorial)** ·
**[Deployment Guide](https://www.tensorzero.com/docs/gateway/deployment)** ·
**[API Reference](https://www.tensorzero.com/docs/gateway/api-reference/inference)** ·
**[Configuration Reference](https://www.tensorzero.com/docs/gateway/configuration-reference)**

The `tensorzero` package provides a Python client for the TensorZero Gateway.
This client allows you to easily make inference requests and assign feedback to them via the gateway.

See our **[API Reference](https://www.tensorzero.com/docs/gateway/api-reference)** for more information.

## Installation

```bash
pip install tensorzero
```

## Basic Usage

### Initialization

The TensorZero client offers synchronous (`TensorZeroGateway`) and asynchronous (`AsyncTensorZeroGateway`) variants.
Additionally, the client can launch an embedded (in-memory) gateway (`build_embedded`) or connect to an external HTTP gateway (`build_http`); both methods return a gateway instance.

By default, the asynchronous client returns a `Future` when you call `build_http` or `build_embedded`, so you must `await` it.
If you prefer to avoid the `await`, you can set `async_setup=False` to initialize the client in a blocking way.
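
For example, a minimal sketch of the blocking setup: with `async_setup=False`, `build_http` returns the client directly rather than a `Future`, so no `await` is needed.

```python
from tensorzero import AsyncTensorZeroGateway

# With async_setup=False, the builder blocks and returns the client directly (no `await`).
client = AsyncTensorZeroGateway.build_http(
    gateway_url="http://localhost:3000",
    async_setup=False,
)
```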

#### Synchronous HTTP Gateway

```python
from tensorzero import TensorZeroGateway

with TensorZeroGateway.build_http(gateway_url="http://localhost:3000") as client:
    ...  # use the client here
```

#### Asynchronous HTTP Gateway

```python
import asyncio

from tensorzero import AsyncTensorZeroGateway


async def run():
    async with await AsyncTensorZeroGateway.build_http(
        gateway_url="http://localhost:3000",
        # async_setup=False  # optional: skip the `await` and run `build_http` synchronously (blocking)
    ) as client:
        ...  # use the client here


if __name__ == "__main__":
    asyncio.run(run())
```

#### Synchronous Embedded Gateway

```python
from tensorzero import TensorZeroGateway

with TensorZeroGateway.build_embedded(
    config_file="/path/to/tensorzero.toml",
    clickhouse_url="http://chuser:chpassword@localhost:8123/tensorzero"
) as client:
    ...  # use the client here
```

#### Asynchronous Embedded Gateway

```python
import asyncio

from tensorzero import AsyncTensorZeroGateway


async def run():
    async with await AsyncTensorZeroGateway.build_embedded(
        config_file="/path/to/tensorzero.toml",
        clickhouse_url="http://chuser:chpassword@localhost:8123/tensorzero"
        # async_setup=False  # optional: skip the `await` and run `build_embedded` synchronously (blocking)
    ) as client:
        ...  # use the client here


if __name__ == "__main__":
    asyncio.run(run())
```

### Inference

#### Non-Streaming Inference with Synchronous Client

```python
with TensorZeroGateway.build_http(gateway_url="http://localhost:3000") as client:
    response = client.inference(
        model_name="openai::gpt-4o-mini",
        input={
            "messages": [
                {"role": "user", "content": "What is the capital of Japan?"},
            ],
        },
    )

    print(response)
```

#### Non-Streaming Inference with Asynchronous Client

```python
async with await AsyncTensorZeroGateway.build_http(gateway_url="http://localhost:3000") as client:
    response = await client.inference(
        model_name="openai::gpt-4o-mini",
        input={
            "messages": [
                {"role": "user", "content": "What is the capital of Japan?"},
            ],
        },
    )

    print(response)
```

#### Streaming Inference with Synchronous Client

```python
with TensorZeroGateway.build_http(gateway_url="http://localhost:3000") as client:
    stream = client.inference(
        model_name="openai::gpt-4o-mini",
        input={
            "messages": [
                {"role": "user", "content": "What is the capital of Japan?"},
            ],
        },
        stream=True,
    )

    for chunk in stream:
        print(chunk)
```

#### Streaming Inference with Asynchronous Client

```python
async with await AsyncTensorZeroGateway.build_http(gateway_url="http://localhost:3000") as client:
    stream = await client.inference(
        model_name="openai::gpt-4o-mini",
        input={
            "messages": [{"role": "user", "content": "What is the capital of Japan?"}],
        },
        stream=True,
    )

    async for chunk in stream:
        print(chunk)
```
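
Each chunk carries a delta of the response. As a rough sketch for reassembling the streamed text (assuming a chat function whose chunks expose a `content` list of text chunks with a `text` attribute; see the API Reference for the exact types), you might do something like:

```python
collected = []

async for chunk in stream:
    # Assumption: chat chunks expose a `content` list whose text chunks carry a `text` field.
    for block in getattr(chunk, "content", []) or []:
        text = getattr(block, "text", None)
        if text:
            collected.append(text)

print("".join(collected))
```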

### Feedback

#### Synchronous

```python
with TensorZeroGateway.build_http(gateway_url="http://localhost:3000") as client:
    response = client.feedback(
        metric_name="thumbs_up",
        inference_id="00000000-0000-0000-0000-000000000000",
        value=True,  # 👍
    )

    print(response)
```

#### Asynchronous

```python
async with await AsyncTensorZeroGateway.build_http(gateway_url="http://localhost:3000") as client:
    response = await client.feedback(
        metric_name="thumbs_up",
        inference_id="00000000-0000-0000-0000-000000000000",
        value=True,  # 👍
    )

    print(response)
```
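
Feedback is typically attached to a prior inference. A minimal sketch tying the two together (assuming the inference response exposes an `inference_id` attribute; check the API Reference for the exact response types):

```python
with TensorZeroGateway.build_http(gateway_url="http://localhost:3000") as client:
    response = client.inference(
        model_name="openai::gpt-4o-mini",
        input={
            "messages": [
                {"role": "user", "content": "What is the capital of Japan?"},
            ],
        },
    )

    # Assumption: the response object carries the ID of the inference to attach feedback to.
    client.feedback(
        metric_name="thumbs_up",
        inference_id=response.inference_id,
        value=True,  # 👍
    )
```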


            
