livepeer-ai

Name: livepeer-ai
Version: 0.9.0
Home page: https://github.com/livepeer/livepeer-ai-python.git
Summary: Python Client SDK for the Livepeer AI API.
Upload time: 2024-12-31 09:11:40
Author: Speakeasy
Requires Python: <4.0,>=3.8
# Livepeer AI Python Library

<div align="left">
    <a href="https://www.speakeasy.com/?utm_source=livepeer-ai&utm_campaign=python"><img src="https://custom-icon-badges.demolab.com/badge/-Built%20By%20Speakeasy-212015?style=for-the-badge&logoColor=FBE331&logo=speakeasy&labelColor=545454" /></a>
    <a href="https://opensource.org/licenses/MIT">
        <img src="https://img.shields.io/badge/License-MIT-blue.svg" style="width: 100px; height: 28px;" />
    </a>
</div>

Welcome to the [Livepeer AI](https://livepeer.ai/) Python library! This library offers seamless integration with the [Livepeer AI API](https://docs.livepeer.org/ai/api-reference/text-to-image), enabling you to easily incorporate powerful AI capabilities into your Python applications.

<!-- Start SDK Installation [installation] -->
## SDK Installation

The SDK can be installed with either the *pip* or *poetry* package manager.

### PIP

*PIP* is the default package installer for Python, enabling easy installation and management of packages from PyPI via the command line.

```bash
pip install livepeer-ai
```

### Poetry

*Poetry* is a modern tool that simplifies dependency management and package publishing by using a single `pyproject.toml` file to handle project metadata and dependencies.

```bash
poetry add livepeer-ai
```
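If you manage the project with Poetry, the command above records the dependency in `pyproject.toml`. A sketch of the resulting entries, based on this release's metadata (the exact version spec is illustrative):

```toml
[tool.poetry.dependencies]
# Python range taken from the package metadata; the livepeer-ai pin is illustrative.
python = ">=3.8,<4.0"
livepeer-ai = "^0.9.0"
```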
<!-- End SDK Installation [installation] -->

<!-- Start IDE Support [idesupport] -->
## IDE Support

### PyCharm

Generally, the SDK will work well with most IDEs out of the box. However, when using PyCharm, you can enjoy much better integration with Pydantic by installing an additional plugin.

- [PyCharm Pydantic Plugin](https://docs.pydantic.dev/latest/integrations/pycharm/)
<!-- End IDE Support [idesupport] -->

<!-- Start SDK Example Usage [usage] -->
## SDK Example Usage

### Example

```python
# Synchronous Example
from livepeer_ai import Livepeer

with Livepeer(
    http_bearer="<YOUR_BEARER_TOKEN_HERE>",
) as livepeer:

    res = livepeer.generate.text_to_image(request={
        "prompt": "<value>",
    })

    assert res.image_response is not None

    # Handle response
    print(res.image_response)
```

<br/>

The same SDK client can also be used to make asynchronous requests with `asyncio`.
```python
# Asynchronous Example
import asyncio
from livepeer_ai import Livepeer

async def main():
    async with Livepeer(
        http_bearer="<YOUR_BEARER_TOKEN_HERE>",
    ) as livepeer:

        res = await livepeer.generate.text_to_image_async(request={
            "prompt": "<value>",
        })

        assert res.image_response is not None

        # Handle response
        print(res.image_response)

asyncio.run(main())
```
<!-- End SDK Example Usage [usage] -->

<!-- Start Available Resources and Operations [operations] -->
## Available Resources and Operations

<details open>
<summary>Available methods</summary>

### [generate](https://github.com/livepeer/livepeer-ai-python/blob/master/docs/sdks/generate/README.md)

* [text_to_image](https://github.com/livepeer/livepeer-ai-python/blob/master/docs/sdks/generate/README.md#text_to_image) - Text To Image
* [image_to_image](https://github.com/livepeer/livepeer-ai-python/blob/master/docs/sdks/generate/README.md#image_to_image) - Image To Image
* [image_to_video](https://github.com/livepeer/livepeer-ai-python/blob/master/docs/sdks/generate/README.md#image_to_video) - Image To Video
* [upscale](https://github.com/livepeer/livepeer-ai-python/blob/master/docs/sdks/generate/README.md#upscale) - Upscale
* [audio_to_text](https://github.com/livepeer/livepeer-ai-python/blob/master/docs/sdks/generate/README.md#audio_to_text) - Audio To Text
* [segment_anything2](https://github.com/livepeer/livepeer-ai-python/blob/master/docs/sdks/generate/README.md#segment_anything2) - Segment Anything 2
* [llm](https://github.com/livepeer/livepeer-ai-python/blob/master/docs/sdks/generate/README.md#llm) - LLM
* [image_to_text](https://github.com/livepeer/livepeer-ai-python/blob/master/docs/sdks/generate/README.md#image_to_text) - Image To Text
* [live_video_to_video](https://github.com/livepeer/livepeer-ai-python/blob/master/docs/sdks/generate/README.md#live_video_to_video) - Live Video To Video
* [text_to_speech](https://github.com/livepeer/livepeer-ai-python/blob/master/docs/sdks/generate/README.md#text_to_speech) - Text To Speech


</details>
<!-- End Available Resources and Operations [operations] -->

<!-- Start File uploads [file-upload] -->
## File uploads

Certain SDK methods accept file objects as part of a request body or multi-part request. It is possible and typically recommended to upload files as a stream rather than reading the entire contents into memory. This avoids excessive memory consumption and potentially crashing with out-of-memory errors when working with very large files. The following example demonstrates how to attach a file stream to a request.

> [!TIP]
>
> For endpoints that handle file uploads, byte arrays can also be used. However, streams are recommended for large files.
>

```python
from livepeer_ai import Livepeer

with Livepeer(
    http_bearer="<YOUR_BEARER_TOKEN_HERE>",
) as livepeer:

    res = livepeer.generate.image_to_image(request={
        "prompt": "<value>",
        "image": {
            "file_name": "example.file",
            "content": open("example.file", "rb"),
        },
    })

    assert res.image_response is not None

    # Handle response
    print(res.image_response)

```
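As the tip above notes, small payloads can also be passed as raw bytes instead of an open stream. A minimal sketch (the request shape mirrors the example above; the in-memory payload is illustrative):

```python
# Illustrative: pass file contents as bytes for small payloads.
# Prefer streams (an open file object) for large files.
image_bytes = b"\x89PNG\r\n\x1a\n"  # placeholder in-memory payload

request = {
    "prompt": "<value>",
    "image": {
        "file_name": "example.png",
        "content": image_bytes,  # bytes instead of open(..., "rb")
    },
}
```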
<!-- End File uploads [file-upload] -->

<!-- Start Retries [retries] -->
## Retries

Some of the endpoints in this SDK support retries. If you use the SDK without any configuration, it will fall back to the default retry strategy provided by the API. However, the default retry strategy can be overridden on a per-operation basis, or across the entire SDK.

To change the default retry strategy for a single API call, simply provide a `RetryConfig` object to the call:
```python
from livepeer_ai import Livepeer
from livepeer_ai.utils import BackoffStrategy, RetryConfig

with Livepeer(
    http_bearer="<YOUR_BEARER_TOKEN_HERE>",
) as livepeer:

    res = livepeer.generate.text_to_image(request={
        "prompt": "<value>",
    },
        RetryConfig("backoff", BackoffStrategy(1, 50, 1.1, 100), False))

    assert res.image_response is not None

    # Handle response
    print(res.image_response)

```
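The four `BackoffStrategy` arguments are not spelled out here; a plausible reading (an assumption, not confirmed by the SDK docs) is initial interval, maximum interval, growth exponent, and maximum elapsed time. Under that assumption, the resulting delay schedule could be sketched as:

```python
# Hedged sketch: the delays an exponential backoff like
# BackoffStrategy(1, 50, 1.1, 100) might produce, under the ASSUMED
# meaning of (initial, max interval, exponent, max elapsed).
def backoff_delays(initial: float, max_interval: float,
                   exponent: float, max_elapsed: float) -> list:
    delays, elapsed, delay = [], 0.0, float(initial)
    # Keep retrying until the next delay would exceed the elapsed budget.
    while elapsed + min(delay, max_interval) <= max_elapsed:
        delays.append(min(delay, max_interval))  # capped at max_interval
        elapsed += delays[-1]
        delay *= exponent                        # grow geometrically
    return delays

schedule = backoff_delays(1, 50, 1.1, 100)
```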

If you'd like to override the default retry strategy for all operations that support retries, you can use the `retry_config` optional parameter when initializing the SDK:
```python
from livepeer_ai import Livepeer
from livepeer_ai.utils import BackoffStrategy, RetryConfig

with Livepeer(
    retry_config=RetryConfig("backoff", BackoffStrategy(1, 50, 1.1, 100), False),
    http_bearer="<YOUR_BEARER_TOKEN_HERE>",
) as livepeer:

    res = livepeer.generate.text_to_image(request={
        "prompt": "<value>",
    })

    assert res.image_response is not None

    # Handle response
    print(res.image_response)

```
<!-- End Retries [retries] -->

<!-- Start Error Handling [errors] -->
## Error Handling

Handling errors in this SDK should largely match your expectations. All operations return a response object or raise an exception.

By default, an API error will raise an `errors.SDKError` exception, which has the following properties:

| Property        | Type             | Description           |
|-----------------|------------------|-----------------------|
| `.status_code`  | *int*            | The HTTP status code  |
| `.message`      | *str*            | The error message     |
| `.raw_response` | *httpx.Response* | The raw HTTP response |
| `.body`         | *str*            | The response content  |

When custom error responses are specified for an operation, the SDK may also raise their associated exceptions. You can refer to respective *Errors* tables in SDK docs for more details on possible exception types for each operation. For example, the `text_to_image_async` method may raise the following exceptions:

| Error Type                 | Status Code   | Content Type     |
| -------------------------- | ------------- | ---------------- |
| errors.HTTPError           | 400, 401, 500 | application/json |
| errors.HTTPValidationError | 422           | application/json |
| errors.SDKError            | 4XX, 5XX      | \*/\*            |

### Example

```python
from livepeer_ai import Livepeer
from livepeer_ai.models import errors

with Livepeer(
    http_bearer="<YOUR_BEARER_TOKEN_HERE>",
) as livepeer:
    res = None
    try:

        res = livepeer.generate.text_to_image(request={
            "prompt": "<value>",
        })

        assert res.image_response is not None

        # Handle response
        print(res.image_response)

    except errors.HTTPError as e:
        # handle e.data: errors.HTTPErrorData
        raise(e)
    except errors.HTTPValidationError as e:
        # handle e.data: errors.HTTPValidationErrorData
        raise(e)
    except errors.SDKError as e:
        # handle exception
        raise(e)
```
<!-- End Error Handling [errors] -->

<!-- Start Server Selection [server] -->
## Server Selection

### Select Server by Index

You can override the default server globally by passing a server index to the `server_idx: int` optional parameter when initializing the SDK client instance. The selected server will then be used as the default on the operations that use it. This table lists the indexes associated with the available servers:

| #   | Server                                      |
| --- | ------------------------------------------- |
| 0   | `https://dream-gateway.livepeer.cloud`      |
| 1   | `https://livepeer.studio/api/beta/generate` |

#### Example

```python
from livepeer_ai import Livepeer

with Livepeer(
    server_idx=1,
    http_bearer="<YOUR_BEARER_TOKEN_HERE>",
) as livepeer:

    res = livepeer.generate.text_to_image(request={
        "prompt": "<value>",
    })

    assert res.image_response is not None

    # Handle response
    print(res.image_response)

```

### Override Server URL Per-Client

The default server can also be overridden globally by passing a URL to the `server_url: str` optional parameter when initializing the SDK client instance. For example:
```python
from livepeer_ai import Livepeer

with Livepeer(
    server_url="https://dream-gateway.livepeer.cloud",
    http_bearer="<YOUR_BEARER_TOKEN_HERE>",
) as livepeer:

    res = livepeer.generate.text_to_image(request={
        "prompt": "<value>",
    })

    assert res.image_response is not None

    # Handle response
    print(res.image_response)

```
<!-- End Server Selection [server] -->

<!-- Start Custom HTTP Client [http-client] -->
## Custom HTTP Client

The Python SDK makes API calls using the [httpx](https://www.python-httpx.org/) HTTP library. To provide a convenient way to configure timeouts, cookies, proxies, custom headers, and other low-level settings, you can initialize the SDK client with your own HTTP client instance.
Depending on whether you are using the sync or async version of the SDK, you can pass an instance of `HttpClient` or `AsyncHttpClient` respectively; these are Protocols that ensure the client has the methods needed to make API calls.
This allows you to wrap the client with your own custom logic, such as adding custom headers, logging, or error handling, or you can simply pass an instance of `httpx.Client` or `httpx.AsyncClient` directly.

For example, you could specify a header for every request that this SDK makes as follows:
```python
from livepeer_ai import Livepeer
import httpx

http_client = httpx.Client(headers={"x-custom-header": "someValue"})
s = Livepeer(client=http_client)
```

or you could wrap the client with your own custom logic:
```python
from typing import Any, Optional, Union

from livepeer_ai import Livepeer
from livepeer_ai.httpclient import AsyncHttpClient
import httpx

class CustomClient(AsyncHttpClient):
    client: AsyncHttpClient

    def __init__(self, client: AsyncHttpClient):
        self.client = client

    async def send(
        self,
        request: httpx.Request,
        *,
        stream: bool = False,
        auth: Union[
            httpx._types.AuthTypes, httpx._client.UseClientDefault, None
        ] = httpx.USE_CLIENT_DEFAULT,
        follow_redirects: Union[
            bool, httpx._client.UseClientDefault
        ] = httpx.USE_CLIENT_DEFAULT,
    ) -> httpx.Response:
        request.headers["Client-Level-Header"] = "added by client"

        return await self.client.send(
            request, stream=stream, auth=auth, follow_redirects=follow_redirects
        )

    def build_request(
        self,
        method: str,
        url: httpx._types.URLTypes,
        *,
        content: Optional[httpx._types.RequestContent] = None,
        data: Optional[httpx._types.RequestData] = None,
        files: Optional[httpx._types.RequestFiles] = None,
        json: Optional[Any] = None,
        params: Optional[httpx._types.QueryParamTypes] = None,
        headers: Optional[httpx._types.HeaderTypes] = None,
        cookies: Optional[httpx._types.CookieTypes] = None,
        timeout: Union[
            httpx._types.TimeoutTypes, httpx._client.UseClientDefault
        ] = httpx.USE_CLIENT_DEFAULT,
        extensions: Optional[httpx._types.RequestExtensions] = None,
    ) -> httpx.Request:
        return self.client.build_request(
            method,
            url,
            content=content,
            data=data,
            files=files,
            json=json,
            params=params,
            headers=headers,
            cookies=cookies,
            timeout=timeout,
            extensions=extensions,
        )

s = Livepeer(async_client=CustomClient(httpx.AsyncClient()))
```
<!-- End Custom HTTP Client [http-client] -->

<!-- Start Authentication [security] -->
## Authentication

### Per-Client Security Schemes

This SDK supports the following security scheme globally:

| Name          | Type | Scheme      |
| ------------- | ---- | ----------- |
| `http_bearer` | http | HTTP Bearer |

To authenticate with the API, the `http_bearer` parameter must be set when initializing the SDK client instance. For example:
```python
from livepeer_ai import Livepeer

with Livepeer(
    http_bearer="<YOUR_BEARER_TOKEN_HERE>",
) as livepeer:

    res = livepeer.generate.text_to_image(request={
        "prompt": "<value>",
    })

    assert res.image_response is not None

    # Handle response
    print(res.image_response)

```
<!-- End Authentication [security] -->

<!-- Start Debugging [debug] -->
## Debugging

You can set up the SDK to emit debug logs for requests and responses by passing your own logger directly into the SDK:
```python
from livepeer_ai import Livepeer
import logging

logging.basicConfig(level=logging.DEBUG)
s = Livepeer(debug_logger=logging.getLogger("livepeer_ai"))
```
<!-- End Debugging [debug] -->

<!-- Start Summary [summary] -->
## Summary

Python Client SDK for the Livepeer AI API.
<!-- End Summary [summary] -->

<!-- Start Table of Contents [toc] -->
## Table of Contents
<!-- $toc-max-depth=2 -->
* [Livepeer AI Python Library](https://github.com/livepeer/livepeer-ai-python/blob/master/#livepeer-ai-python-library)
  * [SDK Installation](https://github.com/livepeer/livepeer-ai-python/blob/master/#sdk-installation)
  * [IDE Support](https://github.com/livepeer/livepeer-ai-python/blob/master/#ide-support)
  * [SDK Example Usage](https://github.com/livepeer/livepeer-ai-python/blob/master/#sdk-example-usage)
  * [Available Resources and Operations](https://github.com/livepeer/livepeer-ai-python/blob/master/#available-resources-and-operations)
  * [File uploads](https://github.com/livepeer/livepeer-ai-python/blob/master/#file-uploads)
  * [Retries](https://github.com/livepeer/livepeer-ai-python/blob/master/#retries)
  * [Error Handling](https://github.com/livepeer/livepeer-ai-python/blob/master/#error-handling)
  * [Server Selection](https://github.com/livepeer/livepeer-ai-python/blob/master/#server-selection)
  * [Custom HTTP Client](https://github.com/livepeer/livepeer-ai-python/blob/master/#custom-http-client)
  * [Authentication](https://github.com/livepeer/livepeer-ai-python/blob/master/#authentication)
  * [Debugging](https://github.com/livepeer/livepeer-ai-python/blob/master/#debugging)
* [Development](https://github.com/livepeer/livepeer-ai-python/blob/master/#development)
  * [Maturity](https://github.com/livepeer/livepeer-ai-python/blob/master/#maturity)
  * [Contributions](https://github.com/livepeer/livepeer-ai-python/blob/master/#contributions)

<!-- End Table of Contents [toc] -->

<!-- Placeholder for Future Speakeasy SDK Sections -->

# Development

## Maturity

This SDK is in **alpha**; there may be breaking changes between versions without a major version bump. We therefore recommend pinning to a specific package version, so that you install the same version each time and only take breaking changes when you intentionally upgrade.
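For example, pinning to the version documented here (the exact pin is up to you):

```shell
pip install "livepeer-ai==0.9.0"
```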

## Contributions

While we value open-source contributions to this SDK, this library is generated programmatically. Any manual changes added to internal files will be overwritten on the next generation. We look forward to hearing your feedback. Feel free to open a [PR](https://github.com/livepeer/livepeer-ai-python/compare) or [an issue](https://github.com/livepeer/livepeer-ai-python/issues) with a proof of concept and we'll do our best to include it in a future release.

### SDK Created by [Speakeasy](https://www.speakeasy.com/?utm_source=livepeer-ai&utm_campaign=python)

            

Raw data

            {
    "_id": null,
    "home_page": "https://github.com/livepeer/livepeer-ai-python.git",
    "name": "livepeer-ai",
    "maintainer": null,
    "docs_url": null,
    "requires_python": "<4.0,>=3.8",
    "maintainer_email": null,
    "keywords": null,
    "author": "Speakeasy",
    "author_email": null,
    "download_url": "https://files.pythonhosted.org/packages/98/ad/f1b10b79f145df24ab75209ec6f13137f5e4fc3fc43687b74ff65ccc28e3/livepeer_ai-0.9.0.tar.gz",
    "platform": null,
    "description": "# Livepeer AI Python Library\n\n<div align=\"left\">\n    <a href=\"https://www.speakeasy.com/?utm_source=livepeer-ai&utm_campaign=python\"><img src=\"https://custom-icon-badges.demolab.com/badge/-Built%20By%20Speakeasy-212015?style=for-the-badge&logoColor=FBE331&logo=speakeasy&labelColor=545454\" /></a>\n    <a href=\"https://opensource.org/licenses/MIT\">\n        <img src=\"https://img.shields.io/badge/License-MIT-blue.svg\" style=\"width: 100px; height: 28px;\" />\n    </a>\n</div>\n\nWelcome to the [Livepeer AI](https://livepeer.ai/) Python! This library offers a seamless integration with the [Livepeer AI API](https://docs.livepeer.org/ai/api-reference/text-to-image), enabling you to easily incorporate powerful AI capabilities into your Python applications, whether they run in the browser or on the server side.\n\n<!-- Start SDK Installation [installation] -->\n## SDK Installation\n\nThe SDK can be installed with either *pip* or *poetry* package managers.\n\n### PIP\n\n*PIP* is the default package installer for Python, enabling easy installation and management of packages from PyPI via the command line.\n\n```bash\npip install livepeer-ai\n```\n\n### Poetry\n\n*Poetry* is a modern tool that simplifies dependency management and package publishing by using a single `pyproject.toml` file to handle project metadata and dependencies.\n\n```bash\npoetry add livepeer-ai\n```\n<!-- End SDK Installation [installation] -->\n\n<!-- Start IDE Support [idesupport] -->\n## IDE Support\n\n### PyCharm\n\nGenerally, the SDK will work well with most IDEs out of the box. 
However, when using PyCharm, you can enjoy much better integration with Pydantic by installing an additional plugin.\n\n- [PyCharm Pydantic Plugin](https://docs.pydantic.dev/latest/integrations/pycharm/)\n<!-- End IDE Support [idesupport] -->\n\n<!-- Start SDK Example Usage [usage] -->\n## SDK Example Usage\n\n### Example\n\n```python\n# Synchronous Example\nfrom livepeer_ai import Livepeer\n\nwith Livepeer(\n    http_bearer=\"<YOUR_BEARER_TOKEN_HERE>\",\n) as livepeer:\n\n    res = livepeer.generate.text_to_image(request={\n        \"prompt\": \"<value>\",\n    })\n\n    assert res.image_response is not None\n\n    # Handle response\n    print(res.image_response)\n```\n\n</br>\n\nThe same SDK client can also be used to make asychronous requests by importing asyncio.\n```python\n# Asynchronous Example\nimport asyncio\nfrom livepeer_ai import Livepeer\n\nasync def main():\n    async with Livepeer(\n        http_bearer=\"<YOUR_BEARER_TOKEN_HERE>\",\n    ) as livepeer:\n\n        res = await livepeer.generate.text_to_image_async(request={\n            \"prompt\": \"<value>\",\n        })\n\n        assert res.image_response is not None\n\n        # Handle response\n        print(res.image_response)\n\nasyncio.run(main())\n```\n<!-- End SDK Example Usage [usage] -->\n\n<!-- Start Available Resources and Operations [operations] -->\n## Available Resources and Operations\n\n<details open>\n<summary>Available methods</summary>\n\n### [generate](https://github.com/livepeer/livepeer-ai-python/blob/master/docs/sdks/generate/README.md)\n\n* [text_to_image](https://github.com/livepeer/livepeer-ai-python/blob/master/docs/sdks/generate/README.md#text_to_image) - Text To Image\n* [image_to_image](https://github.com/livepeer/livepeer-ai-python/blob/master/docs/sdks/generate/README.md#image_to_image) - Image To Image\n* [image_to_video](https://github.com/livepeer/livepeer-ai-python/blob/master/docs/sdks/generate/README.md#image_to_video) - Image To Video\n* 
[upscale](https://github.com/livepeer/livepeer-ai-python/blob/master/docs/sdks/generate/README.md#upscale) - Upscale\n* [audio_to_text](https://github.com/livepeer/livepeer-ai-python/blob/master/docs/sdks/generate/README.md#audio_to_text) - Audio To Text\n* [segment_anything2](https://github.com/livepeer/livepeer-ai-python/blob/master/docs/sdks/generate/README.md#segment_anything2) - Segment Anything 2\n* [llm](https://github.com/livepeer/livepeer-ai-python/blob/master/docs/sdks/generate/README.md#llm) - LLM\n* [image_to_text](https://github.com/livepeer/livepeer-ai-python/blob/master/docs/sdks/generate/README.md#image_to_text) - Image To Text\n* [live_video_to_video](https://github.com/livepeer/livepeer-ai-python/blob/master/docs/sdks/generate/README.md#live_video_to_video) - Live Video To Video\n* [text_to_speech](https://github.com/livepeer/livepeer-ai-python/blob/master/docs/sdks/generate/README.md#text_to_speech) - Text To Speech\n\n\n</details>\n<!-- End Available Resources and Operations [operations] -->\n\n<!-- Start File uploads [file-upload] -->\n## File uploads\n\nCertain SDK methods accept file objects as part of a request body or multi-part request. It is possible and typically recommended to upload files as a stream rather than reading the entire contents into memory. This avoids excessive memory consumption and potentially crashing with out-of-memory errors when working with very large files. The following example demonstrates how to attach a file stream to a request.\n\n> [!TIP]\n>\n> For endpoints that handle file uploads bytes arrays can also be used. 
However, using streams is recommended for large files.\n>\n\n```python\nfrom livepeer_ai import Livepeer\n\nwith Livepeer(\n    http_bearer=\"<YOUR_BEARER_TOKEN_HERE>\",\n) as livepeer:\n\n    res = livepeer.generate.image_to_image(request={\n        \"prompt\": \"<value>\",\n        \"image\": {\n            \"file_name\": \"example.file\",\n            \"content\": open(\"example.file\", \"rb\"),\n        },\n    })\n\n    assert res.image_response is not None\n\n    # Handle response\n    print(res.image_response)\n\n```\n<!-- End File uploads [file-upload] -->\n\n<!-- Start Retries [retries] -->\n## Retries\n\nSome of the endpoints in this SDK support retries. If you use the SDK without any configuration, it will fall back to the default retry strategy provided by the API. However, the default retry strategy can be overridden on a per-operation basis, or across the entire SDK.\n\nTo change the default retry strategy for a single API call, simply provide a `RetryConfig` object to the call:\n```python\nfrom livepeer_ai import Livepeer\nfrom livepeer_ai.utils import BackoffStrategy, RetryConfig\n\nwith Livepeer(\n    http_bearer=\"<YOUR_BEARER_TOKEN_HERE>\",\n) as livepeer:\n\n    res = livepeer.generate.text_to_image(request={\n        \"prompt\": \"<value>\",\n    },\n        RetryConfig(\"backoff\", BackoffStrategy(1, 50, 1.1, 100), False))\n\n    assert res.image_response is not None\n\n    # Handle response\n    print(res.image_response)\n\n```\n\nIf you'd like to override the default retry strategy for all operations that support retries, you can use the `retry_config` optional parameter when initializing the SDK:\n```python\nfrom livepeer_ai import Livepeer\nfrom livepeer_ai.utils import BackoffStrategy, RetryConfig\n\nwith Livepeer(\n    retry_config=RetryConfig(\"backoff\", BackoffStrategy(1, 50, 1.1, 100), False),\n    http_bearer=\"<YOUR_BEARER_TOKEN_HERE>\",\n) as livepeer:\n\n    res = livepeer.generate.text_to_image(request={\n        \"prompt\": 
\"<value>\",\n    })\n\n    assert res.image_response is not None\n\n    # Handle response\n    print(res.image_response)\n\n```\n<!-- End Retries [retries] -->\n\n<!-- Start Error Handling [errors] -->\n## Error Handling\n\nHandling errors in this SDK should largely match your expectations. All operations return a response object or raise an exception.\n\nBy default, an API error will raise a errors.SDKError exception, which has the following properties:\n\n| Property        | Type             | Description           |\n|-----------------|------------------|-----------------------|\n| `.status_code`  | *int*            | The HTTP status code  |\n| `.message`      | *str*            | The error message     |\n| `.raw_response` | *httpx.Response* | The raw HTTP response |\n| `.body`         | *str*            | The response content  |\n\nWhen custom error responses are specified for an operation, the SDK may also raise their associated exceptions. You can refer to respective *Errors* tables in SDK docs for more details on possible exception types for each operation. 
For example, the `text_to_image_async` method may raise the following exceptions:\n\n| Error Type                 | Status Code   | Content Type     |\n| -------------------------- | ------------- | ---------------- |\n| errors.HTTPError           | 400, 401, 500 | application/json |\n| errors.HTTPValidationError | 422           | application/json |\n| errors.SDKError            | 4XX, 5XX      | \\*/\\*            |\n\n### Example\n\n```python\nfrom livepeer_ai import Livepeer\nfrom livepeer_ai.models import errors\n\nwith Livepeer(\n    http_bearer=\"<YOUR_BEARER_TOKEN_HERE>\",\n) as livepeer:\n    res = None\n    try:\n\n        res = livepeer.generate.text_to_image(request={\n            \"prompt\": \"<value>\",\n        })\n\n        assert res.image_response is not None\n\n        # Handle response\n        print(res.image_response)\n\n    except errors.HTTPError as e:\n        # handle e.data: errors.HTTPErrorData\n        raise(e)\n    except errors.HTTPValidationError as e:\n        # handle e.data: errors.HTTPValidationErrorData\n        raise(e)\n    except errors.SDKError as e:\n        # handle exception\n        raise(e)\n```\n<!-- End Error Handling [errors] -->\n\n<!-- Start Server Selection [server] -->\n## Server Selection\n\n### Select Server by Index\n\nYou can override the default server globally by passing a server index to the `server_idx: int` optional parameter when initializing the SDK client instance. The selected server will then be used as the default on the operations that use it. 
This table lists the indexes associated with the available servers:\n\n| #   | Server                                      |\n| --- | ------------------------------------------- |\n| 0   | `https://dream-gateway.livepeer.cloud`      |\n| 1   | `https://livepeer.studio/api/beta/generate` |\n\n#### Example\n\n```python\nfrom livepeer_ai import Livepeer\n\nwith Livepeer(\n    server_idx=1,\n    http_bearer=\"<YOUR_BEARER_TOKEN_HERE>\",\n) as livepeer:\n\n    res = livepeer.generate.text_to_image(request={\n        \"prompt\": \"<value>\",\n    })\n\n    assert res.image_response is not None\n\n    # Handle response\n    print(res.image_response)\n\n```\n\n### Override Server URL Per-Client\n\nThe default server can also be overridden globally by passing a URL to the `server_url: str` optional parameter when initializing the SDK client instance. For example:\n```python\nfrom livepeer_ai import Livepeer\n\nwith Livepeer(\n    server_url=\"https://dream-gateway.livepeer.cloud\",\n    http_bearer=\"<YOUR_BEARER_TOKEN_HERE>\",\n) as livepeer:\n\n    res = livepeer.generate.text_to_image(request={\n        \"prompt\": \"<value>\",\n    })\n\n    assert res.image_response is not None\n\n    # Handle response\n    print(res.image_response)\n\n```\n<!-- End Server Selection [server] -->\n\n<!-- Start Custom HTTP Client [http-client] -->\n## Custom HTTP Client\n\nThe Python SDK makes API calls using the [httpx](https://www.python-httpx.org/) HTTP library.  
In order to provide a convenient way to configure timeouts, cookies, proxies, custom headers, and other low-level configuration, you can initialize the SDK client with your own HTTP client instance.\nDepending on whether you are using the sync or async version of the SDK, you can pass an instance of `HttpClient` or `AsyncHttpClient` respectively, which are Protocol's ensuring that the client has the necessary methods to make API calls.\nThis allows you to wrap the client with your own custom logic, such as adding custom headers, logging, or error handling, or you can just pass an instance of `httpx.Client` or `httpx.AsyncClient` directly.\n\nFor example, you could specify a header for every request that this sdk makes as follows:\n```python\nfrom livepeer_ai import Livepeer\nimport httpx\n\nhttp_client = httpx.Client(headers={\"x-custom-header\": \"someValue\"})\ns = Livepeer(client=http_client)\n```\n\nor you could wrap the client with your own custom logic:\n```python\nfrom livepeer_ai import Livepeer\nfrom livepeer_ai.httpclient import AsyncHttpClient\nimport httpx\n\nclass CustomClient(AsyncHttpClient):\n    client: AsyncHttpClient\n\n    def __init__(self, client: AsyncHttpClient):\n        self.client = client\n\n    async def send(\n        self,\n        request: httpx.Request,\n        *,\n        stream: bool = False,\n        auth: Union[\n            httpx._types.AuthTypes, httpx._client.UseClientDefault, None\n        ] = httpx.USE_CLIENT_DEFAULT,\n        follow_redirects: Union[\n            bool, httpx._client.UseClientDefault\n        ] = httpx.USE_CLIENT_DEFAULT,\n    ) -> httpx.Response:\n        request.headers[\"Client-Level-Header\"] = \"added by client\"\n\n        return await self.client.send(\n            request, stream=stream, auth=auth, follow_redirects=follow_redirects\n        )\n\n    def build_request(\n        self,\n        method: str,\n        url: httpx._types.URLTypes,\n        *,\n        content: 
Optional[httpx._types.RequestContent] = None,\n        data: Optional[httpx._types.RequestData] = None,\n        files: Optional[httpx._types.RequestFiles] = None,\n        json: Optional[Any] = None,\n        params: Optional[httpx._types.QueryParamTypes] = None,\n        headers: Optional[httpx._types.HeaderTypes] = None,\n        cookies: Optional[httpx._types.CookieTypes] = None,\n        timeout: Union[\n            httpx._types.TimeoutTypes, httpx._client.UseClientDefault\n        ] = httpx.USE_CLIENT_DEFAULT,\n        extensions: Optional[httpx._types.RequestExtensions] = None,\n    ) -> httpx.Request:\n        return self.client.build_request(\n            method,\n            url,\n            content=content,\n            data=data,\n            files=files,\n            json=json,\n            params=params,\n            headers=headers,\n            cookies=cookies,\n            timeout=timeout,\n            extensions=extensions,\n        )\n\ns = Livepeer(async_client=CustomClient(httpx.AsyncClient()))\n```\n<!-- End Custom HTTP Client [http-client] -->\n\n<!-- Start Authentication [security] -->\n## Authentication\n\n### Per-Client Security Schemes\n\nThis SDK supports the following security scheme globally:\n\n| Name          | Type | Scheme      |\n| ------------- | ---- | ----------- |\n| `http_bearer` | http | HTTP Bearer |\n\nTo authenticate with the API the `http_bearer` parameter must be set when initializing the SDK client instance. 
For example:
```python
from livepeer_ai import Livepeer

with Livepeer(
    http_bearer="<YOUR_BEARER_TOKEN_HERE>",
) as livepeer:

    res = livepeer.generate.text_to_image(request={
        "prompt": "<value>",
    })

    assert res.image_response is not None

    # Handle response
    print(res.image_response)

```
<!-- End Authentication [security] -->

<!-- Start Debugging [debug] -->
## Debugging

You can set up your SDK to emit debug logs for SDK requests and responses.

You can pass your own logger class directly into your SDK.
```python
from livepeer_ai import Livepeer
import logging

logging.basicConfig(level=logging.DEBUG)
s = Livepeer(debug_logger=logging.getLogger("livepeer_ai"))
```
<!-- End Debugging [debug] -->

<!-- Start Summary [summary] -->
## Summary

Livepeer AI Runner: An application to run AI pipelines
<!-- End Summary [summary] -->

<!-- Start Table of Contents [toc] -->
## Table of Contents
<!-- $toc-max-depth=2 -->
* [Livepeer AI Python Library](https://github.com/livepeer/livepeer-ai-python/blob/master/#livepeer-ai-python-library)
  * [SDK Installation](https://github.com/livepeer/livepeer-ai-python/blob/master/#sdk-installation)
  * [IDE Support](https://github.com/livepeer/livepeer-ai-python/blob/master/#ide-support)
  * [SDK Example Usage](https://github.com/livepeer/livepeer-ai-python/blob/master/#sdk-example-usage)
  * [Available Resources and Operations](https://github.com/livepeer/livepeer-ai-python/blob/master/#available-resources-and-operations)
  * [File uploads](https://github.com/livepeer/livepeer-ai-python/blob/master/#file-uploads)
  * [Retries](https://github.com/livepeer/livepeer-ai-python/blob/master/#retries)
  * [Error Handling](https://github.com/livepeer/livepeer-ai-python/blob/master/#error-handling)
  * [Server Selection](https://github.com/livepeer/livepeer-ai-python/blob/master/#server-selection)
  * [Custom HTTP Client](https://github.com/livepeer/livepeer-ai-python/blob/master/#custom-http-client)
  * [Authentication](https://github.com/livepeer/livepeer-ai-python/blob/master/#authentication)
  * [Debugging](https://github.com/livepeer/livepeer-ai-python/blob/master/#debugging)
* [Development](https://github.com/livepeer/livepeer-ai-python/blob/master/#development)
  * [Maturity](https://github.com/livepeer/livepeer-ai-python/blob/master/#maturity)
  * [Contributions](https://github.com/livepeer/livepeer-ai-python/blob/master/#contributions)

<!-- End Table of Contents [toc] -->

<!-- Placeholder for Future Speakeasy SDK Sections -->

# Development

## Maturity

This SDK is in **alpha**, and there may be breaking changes between versions without a major version update. Therefore, we recommend pinning usage to a specific package version. This way, you can install the same version each time, avoiding breaking changes unless you intentionally upgrade to the latest version.

## Contributions

While we value open-source contributions to this SDK, this library is generated programmatically. Any manual changes added to internal files will be overwritten on the next generation. We look forward to hearing your feedback. Feel free to open a [PR](https://github.com/livepeer/livepeer-ai-python/compare) or [an issue](https://github.com/livepeer/livepeer-ai-python/issues) with a proof of concept, and we'll do our best to include it in a future release.

### SDK Created by [Speakeasy](https://www.speakeasy.com/?utm_source=livepeer-ai&utm_campaign=python)