# Friendli Python SDK
<p align="center">
<img width="10%" alt="Friendli Logo" src="https://friendli.ai/icon.svg">
</p>
<h2><p align="center">Supercharge Generative AI Serving with Friendli 🚀</p></h2>
## Token Setup
When using the Friendli Python SDK, you need to provide a Friendli Token for authentication and authorization purposes. A Friendli Token serves as an alternative method of authorization to signing in with an email and a password. You can generate a new Friendli Token through the [Friendli Suite](https://suite.friendli.ai), on your "Personal settings" page, by following the steps below.
1. Go to the [Friendli Suite](https://suite.friendli.ai) and sign in with your account.
2. Click the profile icon at the top-right corner of the page.
3. Click the "Personal settings" menu.
4. Go to the "Tokens" tab on the navigation bar.
5. Create a new Friendli Token by clicking the "Create token" button.
6. Copy the token and save it in a safe place. You will not be able to see this token again once the page is refreshed.
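The code examples in this README read the token from the `FRIENDLI_TOKEN` environment variable and pass it to the client when it is constructed:

```python
import os

from friendli import SyncFriendli

# Keep the token out of source control: export it in your shell first,
# e.g. `export FRIENDLI_TOKEN="<your token>"`.
friendli = SyncFriendli(token=os.getenv("FRIENDLI_TOKEN", ""))
```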
<!-- No Summary [summary] -->
## Table of Contents
* [SDK Installation](https://github.com/friendliai/friendli-python/blob/master/#sdk-installation)
* [SDK Example Usage](https://github.com/friendliai/friendli-python/blob/master/#sdk-example-usage)
* [Available Resources and Operations](https://github.com/friendliai/friendli-python/blob/master/#available-resources-and-operations)
* [Server-sent event streaming](https://github.com/friendliai/friendli-python/blob/master/#server-sent-event-streaming)
* [Retries](https://github.com/friendliai/friendli-python/blob/master/#retries)
* [Error Handling](https://github.com/friendliai/friendli-python/blob/master/#error-handling)
* [Server Selection](https://github.com/friendliai/friendli-python/blob/master/#server-selection)
* [Custom HTTP Client](https://github.com/friendliai/friendli-python/blob/master/#custom-http-client)
* [Debugging](https://github.com/friendliai/friendli-python/blob/master/#debugging)
* [IDE Support](https://github.com/friendliai/friendli-python/blob/master/#ide-support)
<!-- No Table of Contents [toc] -->
<!-- Start SDK Installation [installation] -->
## SDK Installation
The SDK can be installed with either *pip* or *poetry* package managers.
### PIP
*PIP* is the default package installer for Python, enabling easy installation and management of packages from PyPI via the command line.
```bash
pip install friendli
```
### Poetry
*Poetry* is a modern tool that simplifies dependency management and package publishing by using a single `pyproject.toml` file to handle project metadata and dependencies.
```bash
poetry add friendli
```
<!-- End SDK Installation [installation] -->
<!-- Start SDK Example Usage [usage] -->
## SDK Example Usage
### Chat completions
Given a list of messages forming a conversation, the model generates a response.
```python
# Synchronous Example
from friendli import SyncFriendli
import os
with SyncFriendli(
    token=os.getenv("FRIENDLI_TOKEN", ""),
) as friendli:
    res = friendli.serverless.chat.complete(
        model="meta-llama-3.1-8b-instruct",
        messages=[
            {
                "role": "system",
                "content": "You are a helpful assistant.",
            },
            {
                "role": "user",
                "content": "Hello!",
            },
        ],
        max_tokens=200,
    )

    # Handle response
    print(res)
```
<br/>

The same call can also be made asynchronously with the `AsyncFriendli` client, run via `asyncio`:
```python
# Asynchronous Example
import asyncio
from friendli import AsyncFriendli
import os
async def main():
    async with AsyncFriendli(
        token=os.getenv("FRIENDLI_TOKEN", ""),
    ) as friendli:
        res = await friendli.serverless.chat.complete_async(
            model="meta-llama-3.1-8b-instruct",
            messages=[
                {
                    "role": "system",
                    "content": "You are a helpful assistant.",
                },
                {
                    "role": "user",
                    "content": "Hello!",
                },
            ],
            max_tokens=200,
        )

        # Handle response
        print(res)


asyncio.run(main())
```
### Tool assisted chat completions
Given a list of messages forming a conversation, the model generates a response. Additionally, the model can utilize built-in tools for tool calls, enhancing its capability to provide more comprehensive and actionable responses.
```python
# Synchronous Example
from friendli import SyncFriendli
import os
with SyncFriendli(
    token=os.getenv("FRIENDLI_TOKEN", ""),
) as friendli:
    res = friendli.serverless.tool_assisted_chat.complete(
        model="meta-llama-3.1-8b-instruct",
        messages=[
            {
                "role": "user",
                "content": "What is 3 + 6?",
            },
        ],
        max_tokens=200,
        tools=[
            {
                "type": "math:calculator",
            },
        ],
    )

    # Handle response
    print(res)
```
<br/>

The same call can also be made asynchronously with the `AsyncFriendli` client, run via `asyncio`:
```python
# Asynchronous Example
import asyncio
from friendli import AsyncFriendli
import os
async def main():
    async with AsyncFriendli(
        token=os.getenv("FRIENDLI_TOKEN", ""),
    ) as friendli:
        res = await friendli.serverless.tool_assisted_chat.complete_async(
            model="meta-llama-3.1-8b-instruct",
            messages=[
                {
                    "role": "user",
                    "content": "What is 3 + 6?",
                },
            ],
            max_tokens=200,
            tools=[
                {
                    "type": "math:calculator",
                },
            ],
        )

        # Handle response
        print(res)


asyncio.run(main())
```
<!-- End SDK Example Usage [usage] -->
<!-- Start Available Resources and Operations [operations] -->
## Available Resources and Operations
<details open>
<summary>Available methods</summary>
### [dedicated](https://github.com/friendliai/friendli-python/blob/master/docs/sdks/dedicated/README.md)
#### [dedicated.chat](https://github.com/friendliai/friendli-python/blob/master/docs/sdks/friendlichat/README.md)
* [complete](https://github.com/friendliai/friendli-python/blob/master/docs/sdks/friendlichat/README.md#complete) - Chat completions
* [stream](https://github.com/friendliai/friendli-python/blob/master/docs/sdks/friendlichat/README.md#stream) - Stream chat completions
#### [dedicated.completions](https://github.com/friendliai/friendli-python/blob/master/docs/sdks/friendlicompletions/README.md)
* [complete](https://github.com/friendliai/friendli-python/blob/master/docs/sdks/friendlicompletions/README.md#complete) - Completions
* [stream](https://github.com/friendliai/friendli-python/blob/master/docs/sdks/friendlicompletions/README.md#stream) - Stream completions
#### [dedicated.token](https://github.com/friendliai/friendli-python/blob/master/docs/sdks/friendlitoken/README.md)
* [tokenization](https://github.com/friendliai/friendli-python/blob/master/docs/sdks/friendlitoken/README.md#tokenization) - Tokenization
* [detokenization](https://github.com/friendliai/friendli-python/blob/master/docs/sdks/friendlitoken/README.md#detokenization) - Detokenization
### [serverless](https://github.com/friendliai/friendli-python/blob/master/docs/sdks/serverless/README.md)
#### [serverless.chat](https://github.com/friendliai/friendli-python/blob/master/docs/sdks/chat/README.md)
* [complete](https://github.com/friendliai/friendli-python/blob/master/docs/sdks/chat/README.md#complete) - Chat completions
* [stream](https://github.com/friendliai/friendli-python/blob/master/docs/sdks/chat/README.md#stream) - Stream chat completions
#### [serverless.completions](https://github.com/friendliai/friendli-python/blob/master/docs/sdks/completions/README.md)
* [complete](https://github.com/friendliai/friendli-python/blob/master/docs/sdks/completions/README.md#complete) - Completions
* [stream](https://github.com/friendliai/friendli-python/blob/master/docs/sdks/completions/README.md#stream) - Stream completions
#### [serverless.rag](https://github.com/friendliai/friendli-python/blob/master/docs/sdks/rag/README.md)
* [knowledge_retrieve](https://github.com/friendliai/friendli-python/blob/master/docs/sdks/rag/README.md#knowledge_retrieve) - Knowledge retrieve
#### [serverless.token](https://github.com/friendliai/friendli-python/blob/master/docs/sdks/token/README.md)
* [tokenization](https://github.com/friendliai/friendli-python/blob/master/docs/sdks/token/README.md#tokenization) - Tokenization
* [detokenization](https://github.com/friendliai/friendli-python/blob/master/docs/sdks/token/README.md#detokenization) - Detokenization
#### [serverless.tool_assisted_chat](https://github.com/friendliai/friendli-python/blob/master/docs/sdks/toolassistedchat/README.md)
* [complete](https://github.com/friendliai/friendli-python/blob/master/docs/sdks/toolassistedchat/README.md#complete) - Tool assisted chat completions
* [stream](https://github.com/friendliai/friendli-python/blob/master/docs/sdks/toolassistedchat/README.md#stream) - Stream tool assisted chat completions
</details>
<!-- End Available Resources and Operations [operations] -->
<!-- Start Server-sent event streaming [eventstream] -->
## Server-sent event streaming
[Server-sent events][mdn-sse] are used to stream content from certain
operations. These operations expose the stream as a [Generator][generator] that
can be consumed with a simple `for` loop. The loop terminates when the server
has no more events to send and closes the underlying connection.

The stream is also a [Context Manager][context-manager]: it can be used with
the `with` statement, which closes the underlying connection when the context
is exited.
```python
from friendli import SyncFriendli
import os
with SyncFriendli(
    token=os.getenv("FRIENDLI_TOKEN", ""),
) as friendli:
    res = friendli.serverless.chat.stream(
        model="meta-llama-3.1-8b-instruct",
        messages=[
            {
                "role": "system",
                "content": "You are a helpful assistant.",
            },
            {
                "role": "user",
                "content": "Hello!",
            },
        ],
        max_tokens=200,
    )

    with res as event_stream:
        for event in event_stream:
            # handle event
            print(event, flush=True)
```
[mdn-sse]: https://developer.mozilla.org/en-US/docs/Web/API/Server-sent_events/Using_server-sent_events
[generator]: https://book.pythontips.com/en/latest/generators.html
[context-manager]: https://book.pythontips.com/en/latest/context_managers.html
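If you want the assembled text rather than the raw events, you can accumulate the streamed deltas yourself. A minimal sketch, assuming the events follow the familiar chat-completion chunk shape with `choices[0].delta.content` (this README does not spell out the event model, so treat the field access as an assumption and check the linked `serverless.chat` docs):

```python
from friendli import SyncFriendli
import os

with SyncFriendli(
    token=os.getenv("FRIENDLI_TOKEN", ""),
) as friendli:
    res = friendli.serverless.chat.stream(
        model="meta-llama-3.1-8b-instruct",
        messages=[{"role": "user", "content": "Hello!"}],
        max_tokens=200,
    )

    text_parts = []
    with res as event_stream:
        for event in event_stream:
            # Hypothetical field access; adjust to the event model your
            # installed SDK version actually exposes.
            for choice in getattr(event, "choices", None) or []:
                delta = getattr(choice, "delta", None)
                if delta is not None and getattr(delta, "content", None):
                    text_parts.append(delta.content)

    print("".join(text_parts))
```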
<!-- End Server-sent event streaming [eventstream] -->
<!-- Start Retries [retries] -->
## Retries
Some of the endpoints in this SDK support retries. If you use the SDK without any configuration, it will fall back to the default retry strategy provided by the API. However, the default retry strategy can be overridden on a per-operation basis, or across the entire SDK.
To change the default retry strategy for a single API call, simply provide a `RetryConfig` object to the call:
```python
from friendli import SyncFriendli
from friendli.utils import BackoffStrategy, RetryConfig
import os
with SyncFriendli(
    token=os.getenv("FRIENDLI_TOKEN", ""),
) as friendli:
    res = friendli.serverless.chat.complete(
        model="meta-llama-3.1-8b-instruct",
        messages=[
            {
                "role": "system",
                "content": "You are a helpful assistant.",
            },
            {
                "role": "user",
                "content": "Hello!",
            },
        ],
        max_tokens=200,
        retries=RetryConfig("backoff", BackoffStrategy(1, 50, 1.1, 100), False),
    )

    # Handle response
    print(res)
```
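The positional arguments to `BackoffStrategy` and `RetryConfig` are not self-describing. Under the usual convention for Speakeasy-generated SDKs (an assumption; verify against `friendli.utils` in your installed version), the call above is equivalent to the following keyword form, with intervals in milliseconds:

```python
from friendli.utils import BackoffStrategy, RetryConfig

# Assumed parameter meanings, following the common Speakeasy SDK layout.
retry_config = RetryConfig(
    strategy="backoff",
    backoff=BackoffStrategy(
        initial_interval=1,    # wait before the first retry (ms)
        max_interval=50,       # upper bound on any single wait (ms)
        exponent=1.1,          # growth factor applied between attempts
        max_elapsed_time=100,  # stop retrying once total time exceeds this (ms)
    ),
    retry_connection_errors=False,
)
```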
If you'd like to override the default retry strategy for all operations that support retries, you can use the `retry_config` optional parameter when initializing the SDK:
```python
from friendli import SyncFriendli
from friendli.utils import BackoffStrategy, RetryConfig
import os
with SyncFriendli(
    retry_config=RetryConfig("backoff", BackoffStrategy(1, 50, 1.1, 100), False),
    token=os.getenv("FRIENDLI_TOKEN", ""),
) as friendli:
    res = friendli.serverless.chat.complete(
        model="meta-llama-3.1-8b-instruct",
        messages=[
            {
                "role": "system",
                "content": "You are a helpful assistant.",
            },
            {
                "role": "user",
                "content": "Hello!",
            },
        ],
        max_tokens=200,
    )

    # Handle response
    print(res)
```
<!-- End Retries [retries] -->
<!-- Start Error Handling [errors] -->
## Error Handling
Handling errors in this SDK should largely match your expectations. All operations return a response object or raise an exception.
By default, an API error will raise a `models.SDKError` exception, which has the following properties:
| Property | Type | Description |
|-----------------|------------------|-----------------------|
| `.status_code` | *int* | The HTTP status code |
| `.message` | *str* | The error message |
| `.raw_response` | *httpx.Response* | The raw HTTP response |
| `.body` | *str* | The response content |
When custom error responses are specified for an operation, the SDK may also raise their associated exceptions. Refer to the respective *Errors* tables in the SDK docs for details on the possible exception types for each operation. For example, the `complete_async` method may raise the following exceptions:
| Error Type | Status Code | Content Type |
| --------------- | ----------- | ------------ |
| models.SDKError | 4XX, 5XX | \*/\* |
### Example
```python
from friendli import SyncFriendli, models
import os
with SyncFriendli(
    token=os.getenv("FRIENDLI_TOKEN", ""),
) as friendli:
    res = None
    try:
        res = friendli.serverless.chat.complete(
            model="meta-llama-3.1-8b-instruct",
            messages=[
                {
                    "role": "system",
                    "content": "You are a helpful assistant.",
                },
                {
                    "role": "user",
                    "content": "Hello!",
                },
            ],
            max_tokens=200,
        )

        # Handle response
        print(res)
    except models.SDKError as e:
        # handle exception
        raise e
```
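Rather than re-raising, you can also act on the properties listed in the table above, e.g. to log the failure or branch on the status code:

```python
from friendli import SyncFriendli, models
import os

with SyncFriendli(
    token=os.getenv("FRIENDLI_TOKEN", ""),
) as friendli:
    try:
        res = friendli.serverless.chat.complete(
            model="meta-llama-3.1-8b-instruct",
            messages=[{"role": "user", "content": "Hello!"}],
            max_tokens=200,
        )
        print(res)
    except models.SDKError as e:
        # All four properties come from the table documented above.
        print(f"HTTP {e.status_code}: {e.message}")
        print(e.body)                  # raw response content
        print(e.raw_response.headers)  # the full httpx.Response is available too
```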
<!-- End Error Handling [errors] -->
<!-- Start Server Selection [server] -->
## Server Selection
### Override Server URL Per-Client
The default server can be overridden globally by passing a URL to the `server_url: str` optional parameter when initializing the SDK client instance. For example:
```python
from friendli import SyncFriendli
import os
with SyncFriendli(
    server_url="https://api.friendli.ai",
    token=os.getenv("FRIENDLI_TOKEN", ""),
) as friendli:
    res = friendli.serverless.chat.complete(
        model="meta-llama-3.1-8b-instruct",
        messages=[
            {
                "role": "system",
                "content": "You are a helpful assistant.",
            },
            {
                "role": "user",
                "content": "Hello!",
            },
        ],
        max_tokens=200,
    )

    # Handle response
    print(res)
```
<!-- End Server Selection [server] -->
<!-- Start Custom HTTP Client [http-client] -->
## Custom HTTP Client
The Python SDK makes API calls using the [httpx](https://www.python-httpx.org/) HTTP library. To provide a convenient way to configure timeouts, cookies, proxies, custom headers, and other low-level settings, you can initialize the SDK client with your own HTTP client instance.

Depending on whether you are using the sync or async version of the SDK, you can pass an instance of `HttpClient` or `AsyncHttpClient` respectively; these are Protocols ensuring that the client has the necessary methods to make API calls.

This allows you to wrap the client with your own custom logic, such as adding custom headers, logging, or error handling, or you can simply pass an instance of `httpx.Client` or `httpx.AsyncClient` directly.

For example, you could add a header to every request that this SDK makes as follows:
```python
from friendli import SyncFriendli
import httpx
http_client = httpx.Client(headers={"x-custom-header": "someValue"})
s = SyncFriendli(client=http_client)
```
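Because this is a plain `httpx.Client`, the same mechanism covers the other low-level settings mentioned above. A sketch using standard httpx options (note that older httpx versions spell the proxy argument `proxies` instead of `proxy`):

```python
from friendli import SyncFriendli
import httpx

# A 10-second overall timeout and an HTTP proxy; the proxy URL is a placeholder.
http_client = httpx.Client(
    timeout=httpx.Timeout(10.0),
    proxy="http://localhost:8888",
)
s = SyncFriendli(client=http_client)
```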
or you could wrap the client with your own custom logic:
```python
from friendli import AsyncFriendli
from friendli.httpclient import AsyncHttpClient
import httpx
from typing import Any, Optional, Union
class CustomClient(AsyncHttpClient):
    client: AsyncHttpClient

    def __init__(self, client: AsyncHttpClient):
        self.client = client

    async def send(
        self,
        request: httpx.Request,
        *,
        stream: bool = False,
        auth: Union[
            httpx._types.AuthTypes, httpx._client.UseClientDefault, None
        ] = httpx.USE_CLIENT_DEFAULT,
        follow_redirects: Union[
            bool, httpx._client.UseClientDefault
        ] = httpx.USE_CLIENT_DEFAULT,
    ) -> httpx.Response:
        request.headers["Client-Level-Header"] = "added by client"

        return await self.client.send(
            request, stream=stream, auth=auth, follow_redirects=follow_redirects
        )

    def build_request(
        self,
        method: str,
        url: httpx._types.URLTypes,
        *,
        content: Optional[httpx._types.RequestContent] = None,
        data: Optional[httpx._types.RequestData] = None,
        files: Optional[httpx._types.RequestFiles] = None,
        json: Optional[Any] = None,
        params: Optional[httpx._types.QueryParamTypes] = None,
        headers: Optional[httpx._types.HeaderTypes] = None,
        cookies: Optional[httpx._types.CookieTypes] = None,
        timeout: Union[
            httpx._types.TimeoutTypes, httpx._client.UseClientDefault
        ] = httpx.USE_CLIENT_DEFAULT,
        extensions: Optional[httpx._types.RequestExtensions] = None,
    ) -> httpx.Request:
        return self.client.build_request(
            method,
            url,
            content=content,
            data=data,
            files=files,
            json=json,
            params=params,
            headers=headers,
            cookies=cookies,
            timeout=timeout,
            extensions=extensions,
        )


s = AsyncFriendli(async_client=CustomClient(httpx.AsyncClient()))
```
<!-- End Custom HTTP Client [http-client] -->
<!-- No Authentication [security] -->
<!-- Start Debugging [debug] -->
## Debugging
You can set up the SDK to emit debug logs for its requests and responses by passing your own logger directly into the client.
```python
from friendli import SyncFriendli
import logging
logging.basicConfig(level=logging.DEBUG)
s = SyncFriendli(debug_logger=logging.getLogger("friendli"))
```
You can also enable a default debug logger by setting the environment variable `FRIENDLI_DEBUG` to `true`.
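If you prefer to flip that switch from code rather than the shell, a small sketch (the assumption here is that the variable is read when the client is created, so it must be set first):

```python
import os

from friendli import SyncFriendli

# Set the flag before constructing the client so the default logger sees it.
os.environ["FRIENDLI_DEBUG"] = "true"

s = SyncFriendli()
```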
<!-- End Debugging [debug] -->
<!-- Start IDE Support [idesupport] -->
## IDE Support
### PyCharm
Generally, the SDK will work well with most IDEs out of the box. However, when using PyCharm, you can enjoy much better integration with Pydantic by installing an additional plugin.
- [PyCharm Pydantic Plugin](https://docs.pydantic.dev/latest/integrations/pycharm/)
<!-- End IDE Support [idesupport] -->
<!-- Placeholder for Future Speakeasy SDK Sections -->