# TensorZero Python Client
**[Website](https://www.tensorzero.com/)** ·
**[Docs](https://www.tensorzero.com/docs)** ·
**[Twitter](https://www.x.com/tensorzero)** ·
**[Slack](https://www.tensorzero.com/slack)** ·
**[Discord](https://www.tensorzero.com/discord)**

**[Quick Start (5min)](https://www.tensorzero.com/docs/quickstart)** ·
**[Comprehensive Tutorial](https://www.tensorzero.com/docs/gateway/tutorial)** ·
**[Deployment Guide](https://www.tensorzero.com/docs/gateway/deployment)** ·
**[API Reference](https://www.tensorzero.com/docs/gateway/api-reference/inference)** ·
**[Configuration Reference](https://www.tensorzero.com/docs/gateway/configuration-reference)**

The `tensorzero` package provides a Python client for the TensorZero Gateway.
This client allows you to easily make inference requests and assign feedback to them via the gateway.
See our **[API Reference](https://www.tensorzero.com/docs/gateway/api-reference)** for more information.
## Installation
```bash
pip install tensorzero
```
## Basic Usage
### Initialization
The TensorZero client offers synchronous (`TensorZeroGateway`) and asynchronous (`AsyncTensorZeroGateway`) variants.
Additionally, the client can either launch an embedded (in-memory) gateway (`build_embedded`) or connect to an external HTTP gateway (`build_http`); both methods return a gateway instance.
By default, the asynchronous client returns a `Future` when you call `build_http` or `build_embedded`, so you must `await` it.
If you prefer to avoid the `await`, you can set `async_setup=False` to initialize the client in a blocking way.
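For example, a minimal sketch of the blocking setup (using the same placeholder gateway URL as the rest of this README):

```python
from tensorzero import AsyncTensorZeroGateway

# With `async_setup=False`, `build_http` blocks until the client is ready and
# returns the gateway instance directly instead of a `Future`, so no `await` is needed.
client = AsyncTensorZeroGateway.build_http(
    gateway_url="http://localhost:3000",
    async_setup=False,
)
```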
#### Synchronous HTTP Gateway
```python
from tensorzero import TensorZeroGateway

with TensorZeroGateway.build_http(gateway_url="http://localhost:3000") as client:
    ...  # run inference, send feedback, etc.
```
#### Asynchronous HTTP Gateway
```python
import asyncio

from tensorzero import AsyncTensorZeroGateway


async def run():
    async with await AsyncTensorZeroGateway.build_http(
        gateway_url="http://localhost:3000",
        # async_setup=False,  # optional: skip the `await` and run `build_http` synchronously (blocking)
    ) as client:
        ...  # run inference, send feedback, etc.


if __name__ == "__main__":
    asyncio.run(run())
```
#### Synchronous Embedded Gateway
```python
from tensorzero import TensorZeroGateway

with TensorZeroGateway.build_embedded(
    config_file="/path/to/tensorzero.toml",
    clickhouse_url="http://chuser:chpassword@localhost:8123/tensorzero",
) as client:
    ...  # run inference, send feedback, etc.
```
#### Asynchronous Embedded Gateway
```python
import asyncio

from tensorzero import AsyncTensorZeroGateway


async def run():
    async with await AsyncTensorZeroGateway.build_embedded(
        config_file="/path/to/tensorzero.toml",
        clickhouse_url="http://chuser:chpassword@localhost:8123/tensorzero",
        # async_setup=False,  # optional: skip the `await` and run `build_embedded` synchronously (blocking)
    ) as client:
        ...  # run inference, send feedback, etc.


if __name__ == "__main__":
    asyncio.run(run())
```
### Inference
#### Non-Streaming Inference with Synchronous Client
```python
with TensorZeroGateway.build_http(gateway_url="http://localhost:3000") as client:
    response = client.inference(
        model_name="openai::gpt-4o-mini",
        input={
            "messages": [
                {"role": "user", "content": "What is the capital of Japan?"},
            ],
        },
    )

    print(response)
```
#### Non-Streaming Inference with Asynchronous Client
```python
async with await AsyncTensorZeroGateway.build_http(gateway_url="http://localhost:3000") as client:
    response = await client.inference(
        model_name="openai::gpt-4o-mini",
        input={
            "messages": [
                {"role": "user", "content": "What is the capital of Japan?"},
            ],
        },
    )

    print(response)
```
#### Streaming Inference with Synchronous Client
```python
with TensorZeroGateway.build_http(gateway_url="http://localhost:3000") as client:
    stream = client.inference(
        model_name="openai::gpt-4o-mini",
        input={
            "messages": [
                {"role": "user", "content": "What is the capital of Japan?"},
            ],
        },
        stream=True,
    )

    for chunk in stream:
        print(chunk)
```
#### Streaming Inference with Asynchronous Client
```python
async with await AsyncTensorZeroGateway.build_http(gateway_url="http://localhost:3000") as client:
    stream = await client.inference(
        model_name="openai::gpt-4o-mini",
        input={
            "messages": [{"role": "user", "content": "What is the capital of Japan?"}],
        },
        stream=True,
    )

    async for chunk in stream:
        print(chunk)
```
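Each chunk carries an incremental piece of the model's output. As a rough sketch (shown with the synchronous client; use `async for` with the asynchronous one), the text deltas can be reassembled into the full response. This assumes each chunk exposes a `content` list whose items carry a `text` field; check the **[API Reference](https://www.tensorzero.com/docs/gateway/api-reference/inference)** for the exact chunk shape.

```python
# Sketch: accumulate streamed text deltas into a single string.
# Assumes each chunk has a `content` list of blocks with a `text` attribute.
answer = ""
for chunk in stream:
    for block in getattr(chunk, "content", None) or []:
        text = getattr(block, "text", None)
        if text:
            answer += text

print(answer)
```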
### Feedback
#### Synchronous
```python
with TensorZeroGateway.build_http(gateway_url="http://localhost:3000") as client:
    response = client.feedback(
        metric_name="thumbs_up",
        inference_id="00000000-0000-0000-0000-000000000000",
        value=True,  # 👍
    )

    print(response)
```
#### Asynchronous
```python
async with await AsyncTensorZeroGateway.build_http(gateway_url="http://localhost:3000") as client:
    response = await client.feedback(
        metric_name="thumbs_up",
        inference_id="00000000-0000-0000-0000-000000000000",
        value=True,  # 👍
    )

    print(response)
```
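Feedback is keyed by the `inference_id` of an earlier inference. Here is a minimal end-to-end sketch that runs an inference and then attaches feedback to it; it assumes the response object exposes an `inference_id` attribute and that a `thumbs_up` metric is configured, so see the **[API Reference](https://www.tensorzero.com/docs/gateway/api-reference)** for the exact response shape.

```python
from tensorzero import TensorZeroGateway

with TensorZeroGateway.build_http(gateway_url="http://localhost:3000") as client:
    # Run an inference...
    response = client.inference(
        model_name="openai::gpt-4o-mini",
        input={
            "messages": [
                {"role": "user", "content": "What is the capital of Japan?"},
            ],
        },
    )

    # ...then attach feedback to it using its inference ID.
    # Assumes the response object exposes an `inference_id` attribute.
    client.feedback(
        metric_name="thumbs_up",
        inference_id=response.inference_id,
        value=True,
    )
```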