mistralai 1.2.2 (PyPI)

* **Home page:** https://github.com/mistralai/client-python.git
* **Summary:** Python Client SDK for the Mistral AI API.
* **Uploaded:** 2024-11-12 18:29:22
* **Author:** Mistral
* **Requires Python:** <4.0,>=3.8
# Mistral Python Client

## Migration warning
 
This documentation is for the Mistral AI SDK v1. You can find details on how to migrate from v0 to v1 [here](https://github.com/mistralai/client-python/blob/master/MIGRATION.md).

## API Key Setup

Before you begin, you will need a Mistral AI API key.

1. Get your own Mistral API Key: <https://docs.mistral.ai/#api-access>
2. Set your Mistral API Key as an environment variable. You only need to do this once.

```bash
# set Mistral API Key (using zsh for example)
$ echo 'export MISTRAL_API_KEY=[your_key_here]' >> ~/.zshenv

# reload the environment (or just quit and open a new terminal)
$ source ~/.zshenv
```
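Before running any of the examples, you can sanity-check that the variable is visible to Python. A minimal stdlib-only sketch:

```python
import os

# The SDK reads the key from the MISTRAL_API_KEY environment variable;
# the empty-string default mirrors how the examples below pass it to the client.
api_key = os.getenv("MISTRAL_API_KEY", "")
print("MISTRAL_API_KEY configured:", bool(api_key))
```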

<!-- Start Summary [summary] -->
## Summary

Mistral AI API: Our Chat Completion and Embeddings APIs specification. Create your account on [La Plateforme](https://console.mistral.ai) to get access and read the [docs](https://docs.mistral.ai) to learn how to use it.
<!-- End Summary [summary] -->

<!-- Start Table of Contents [toc] -->
## Table of Contents

* [SDK Installation](#sdk-installation)
* [IDE Support](#ide-support)
* [SDK Example Usage](#sdk-example-usage)
* [Available Resources and Operations](#available-resources-and-operations)
* [Server-sent event streaming](#server-sent-event-streaming)
* [File uploads](#file-uploads)
* [Retries](#retries)
* [Error Handling](#error-handling)
* [Server Selection](#server-selection)
* [Custom HTTP Client](#custom-http-client)
* [Authentication](#authentication)
* [Debugging](#debugging)
<!-- End Table of Contents [toc] -->

<!-- Start SDK Installation [installation] -->
## SDK Installation

The SDK can be installed with either the *pip* or *poetry* package manager.

### PIP

*PIP* is the default package installer for Python, enabling easy installation and management of packages from PyPI via the command line.

```bash
pip install mistralai
```

### Poetry

*Poetry* is a modern tool that simplifies dependency management and package publishing by using a single `pyproject.toml` file to handle project metadata and dependencies.

```bash
poetry add mistralai
```
<!-- End SDK Installation [installation] -->

<!-- Start SDK Example Usage [usage] -->
## SDK Example Usage

### Create Chat Completions

This example shows how to create chat completions.

```python
# Synchronous Example
from mistralai import Mistral
import os

s = Mistral(
    api_key=os.getenv("MISTRAL_API_KEY", ""),
)

res = s.chat.complete(model="mistral-small-latest", messages=[
    {
        "content": "Who is the best French painter? Answer in one short sentence.",
        "role": "user",
    },
])

if res is not None:
    # handle response
    pass
```

<br />

The same SDK client can also be used to make asynchronous requests by importing `asyncio`.
```python
# Asynchronous Example
import asyncio
from mistralai import Mistral
import os

async def main():
    s = Mistral(
        api_key=os.getenv("MISTRAL_API_KEY", ""),
    )
    res = await s.chat.complete_async(model="mistral-small-latest", messages=[
        {
            "content": "Who is the best French painter? Answer in one short sentence.",
            "role": "user",
        },
    ])
    if res is not None:
        # handle response
        pass

asyncio.run(main())
```

### Upload a file

This example shows how to upload a file.

```python
# Synchronous Example
from mistralai import Mistral
import os

s = Mistral(
    api_key=os.getenv("MISTRAL_API_KEY", ""),
)

res = s.files.upload(file={
    "file_name": "example.file",
    "content": open("example.file", "rb"),
})

if res is not None:
    # handle response
    pass
```

<br />

The same SDK client can also be used to make asynchronous requests by importing `asyncio`.
```python
# Asynchronous Example
import asyncio
from mistralai import Mistral
import os

async def main():
    s = Mistral(
        api_key=os.getenv("MISTRAL_API_KEY", ""),
    )
    res = await s.files.upload_async(file={
        "file_name": "example.file",
        "content": open("example.file", "rb"),
    })
    if res is not None:
        # handle response
        pass

asyncio.run(main())
```

### Create Agents Completions

This example shows how to create agents completions.

```python
# Synchronous Example
from mistralai import Mistral
import os

s = Mistral(
    api_key=os.getenv("MISTRAL_API_KEY", ""),
)

res = s.agents.complete(messages=[
    {
        "content": "Who is the best French painter? Answer in one short sentence.",
        "role": "user",
    },
], agent_id="<value>")

if res is not None:
    # handle response
    pass
```

<br />

The same SDK client can also be used to make asynchronous requests by importing `asyncio`.
```python
# Asynchronous Example
import asyncio
from mistralai import Mistral
import os

async def main():
    s = Mistral(
        api_key=os.getenv("MISTRAL_API_KEY", ""),
    )
    res = await s.agents.complete_async(messages=[
        {
            "content": "Who is the best French painter? Answer in one short sentence.",
            "role": "user",
        },
    ], agent_id="<value>")
    if res is not None:
        # handle response
        pass

asyncio.run(main())
```

### Create Embedding Request

This example shows how to create an embedding request.

```python
# Synchronous Example
from mistralai import Mistral
import os

s = Mistral(
    api_key=os.getenv("MISTRAL_API_KEY", ""),
)

res = s.embeddings.create(inputs=[
    "Embed this sentence.",
    "As well as this one.",
], model="mistral-embed")

if res is not None:
    # handle response
    pass
```

<br />

The same SDK client can also be used to make asynchronous requests by importing `asyncio`.
```python
# Asynchronous Example
import asyncio
from mistralai import Mistral
import os

async def main():
    s = Mistral(
        api_key=os.getenv("MISTRAL_API_KEY", ""),
    )
    res = await s.embeddings.create_async(inputs=[
        "Embed this sentence.",
        "As well as this one.",
    ], model="mistral-embed")
    if res is not None:
        # handle response
        pass

asyncio.run(main())
```
<!-- End SDK Example Usage [usage] -->


### More examples

You can run the examples in the `examples/` directory using `poetry run` or by entering the virtual environment using `poetry shell`.


## Providers' SDKs Example Usage

### Azure AI

**Prerequisites**

Before you begin, ensure you have `AZURE_ENDPOINT` and `AZURE_API_KEY` set. To obtain these, you will need to deploy Mistral on Azure AI.
See [instructions for deploying Mistral on Azure AI here](https://docs.mistral.ai/deployment/cloud/azure/).

Here's a basic example to get you started. You can also run [the example in the `examples` directory](https://github.com/mistralai/client-python/blob/master/examples/azure).

```python
import asyncio
import os

from mistralai_azure import MistralAzure

client = MistralAzure(
    azure_api_key=os.getenv("AZURE_API_KEY", ""),
    azure_endpoint=os.getenv("AZURE_ENDPOINT", "")
)

async def main() -> None:
    res = await client.chat.complete_async(
        max_tokens=100,
        temperature=0.5,
        messages=[
            {
                "content": "Hello there!",
                "role": "user"
            }
        ]
    )
    print(res)

asyncio.run(main())
```
The documentation for the Azure SDK is available [here](https://github.com/mistralai/client-python/blob/master/packages/mistralai_azure/README.md).

### Google Cloud


**Prerequisites**

Before you begin, you will need to create a Google Cloud project and enable the Mistral API. To do this, follow the instructions [here](https://docs.mistral.ai/deployment/cloud/vertex/).

To run this locally you will also need to authenticate with Google Cloud. You can do this by running:

```bash
gcloud auth application-default login
```

**Step 1: Install**

Install the extra dependencies specific to Google Cloud:

```bash
pip install mistralai[gcp]
```

**Step 2: Example Usage**

Here's a basic example to get you started.

```python
import asyncio
from mistralai_gcp import MistralGoogleCloud

client = MistralGoogleCloud()


async def main() -> None:
    res = await client.chat.complete_async(
        model="mistral-small-2402",
        messages=[
            {
                "content": "Hello there!",
                "role": "user"
            }
        ]
    )
    print(res)

asyncio.run(main())
```

The documentation for the GCP SDK is available [here](https://github.com/mistralai/client-python/blob/master/packages/mistralai_gcp/README.md).


<!-- Start Available Resources and Operations [operations] -->
## Available Resources and Operations

<details open>
<summary>Available methods</summary>

### [agents](https://github.com/mistralai/client-python/blob/master/docs/sdks/agents/README.md)

* [complete](https://github.com/mistralai/client-python/blob/master/docs/sdks/agents/README.md#complete) - Agents Completion
* [stream](https://github.com/mistralai/client-python/blob/master/docs/sdks/agents/README.md#stream) - Stream Agents completion

### [batch](https://github.com/mistralai/client-python/blob/master/docs/sdks/batch/README.md)


#### [batch.jobs](https://github.com/mistralai/client-python/blob/master/docs/sdks/mistraljobs/README.md)

* [list](https://github.com/mistralai/client-python/blob/master/docs/sdks/mistraljobs/README.md#list) - Get Batch Jobs
* [create](https://github.com/mistralai/client-python/blob/master/docs/sdks/mistraljobs/README.md#create) - Create Batch Job
* [get](https://github.com/mistralai/client-python/blob/master/docs/sdks/mistraljobs/README.md#get) - Get Batch Job
* [cancel](https://github.com/mistralai/client-python/blob/master/docs/sdks/mistraljobs/README.md#cancel) - Cancel Batch Job

### [chat](https://github.com/mistralai/client-python/blob/master/docs/sdks/chat/README.md)

* [complete](https://github.com/mistralai/client-python/blob/master/docs/sdks/chat/README.md#complete) - Chat Completion
* [stream](https://github.com/mistralai/client-python/blob/master/docs/sdks/chat/README.md#stream) - Stream chat completion

### [classifiers](https://github.com/mistralai/client-python/blob/master/docs/sdks/classifiers/README.md)

* [moderate](https://github.com/mistralai/client-python/blob/master/docs/sdks/classifiers/README.md#moderate) - Moderations
* [moderate_chat](https://github.com/mistralai/client-python/blob/master/docs/sdks/classifiers/README.md#moderate_chat) - Moderations Chat

### [embeddings](https://github.com/mistralai/client-python/blob/master/docs/sdks/embeddings/README.md)

* [create](https://github.com/mistralai/client-python/blob/master/docs/sdks/embeddings/README.md#create) - Embeddings

### [files](https://github.com/mistralai/client-python/blob/master/docs/sdks/files/README.md)

* [upload](https://github.com/mistralai/client-python/blob/master/docs/sdks/files/README.md#upload) - Upload File
* [list](https://github.com/mistralai/client-python/blob/master/docs/sdks/files/README.md#list) - List Files
* [retrieve](https://github.com/mistralai/client-python/blob/master/docs/sdks/files/README.md#retrieve) - Retrieve File
* [delete](https://github.com/mistralai/client-python/blob/master/docs/sdks/files/README.md#delete) - Delete File
* [download](https://github.com/mistralai/client-python/blob/master/docs/sdks/files/README.md#download) - Download File

### [fim](https://github.com/mistralai/client-python/blob/master/docs/sdks/fim/README.md)

* [complete](https://github.com/mistralai/client-python/blob/master/docs/sdks/fim/README.md#complete) - Fim Completion
* [stream](https://github.com/mistralai/client-python/blob/master/docs/sdks/fim/README.md#stream) - Stream fim completion

### [fine_tuning](https://github.com/mistralai/client-python/blob/master/docs/sdks/finetuning/README.md)


#### [fine_tuning.jobs](https://github.com/mistralai/client-python/blob/master/docs/sdks/jobs/README.md)

* [list](https://github.com/mistralai/client-python/blob/master/docs/sdks/jobs/README.md#list) - Get Fine Tuning Jobs
* [create](https://github.com/mistralai/client-python/blob/master/docs/sdks/jobs/README.md#create) - Create Fine Tuning Job
* [get](https://github.com/mistralai/client-python/blob/master/docs/sdks/jobs/README.md#get) - Get Fine Tuning Job
* [cancel](https://github.com/mistralai/client-python/blob/master/docs/sdks/jobs/README.md#cancel) - Cancel Fine Tuning Job
* [start](https://github.com/mistralai/client-python/blob/master/docs/sdks/jobs/README.md#start) - Start Fine Tuning Job


### [models](https://github.com/mistralai/client-python/blob/master/docs/sdks/models/README.md)

* [list](https://github.com/mistralai/client-python/blob/master/docs/sdks/models/README.md#list) - List Models
* [retrieve](https://github.com/mistralai/client-python/blob/master/docs/sdks/models/README.md#retrieve) - Retrieve Model
* [delete](https://github.com/mistralai/client-python/blob/master/docs/sdks/models/README.md#delete) - Delete Model
* [update](https://github.com/mistralai/client-python/blob/master/docs/sdks/models/README.md#update) - Update Fine Tuned Model
* [archive](https://github.com/mistralai/client-python/blob/master/docs/sdks/models/README.md#archive) - Archive Fine Tuned Model
* [unarchive](https://github.com/mistralai/client-python/blob/master/docs/sdks/models/README.md#unarchive) - Unarchive Fine Tuned Model

</details>
<!-- End Available Resources and Operations [operations] -->

<!-- Start Server-sent event streaming [eventstream] -->
## Server-sent event streaming

[Server-sent events][mdn-sse] are used to stream content from certain
operations. These operations expose the stream as a [Generator][generator] that
can be consumed with a simple `for` loop. The loop terminates when the server
has no more events to send and closes the underlying connection.

```python
from mistralai import Mistral
import os

s = Mistral(
    api_key=os.getenv("MISTRAL_API_KEY", ""),
)

res = s.chat.stream(model="mistral-small-latest", messages=[
    {
        "content": "Who is the best French painter? Answer in one short sentence.",
        "role": "user",
    },
])

if res is not None:
    for event in res:
        # handle event
        print(event, flush=True)

```

[mdn-sse]: https://developer.mozilla.org/en-US/docs/Web/API/Server-sent_events/Using_server-sent_events
[generator]: https://wiki.python.org/moin/Generators
<!-- End Server-sent event streaming [eventstream] -->

<!-- Start File uploads [file-upload] -->
## File uploads

Certain SDK methods accept file objects as part of a request body or multipart request. It is possible, and typically recommended, to upload files as a stream rather than reading the entire contents into memory. This avoids excessive memory consumption and potential out-of-memory crashes when working with very large files. The following example demonstrates how to attach a file stream to a request.

> [!TIP]
>
> For endpoints that handle file uploads, byte arrays can also be used. However, using streams is recommended for large files.
>

```python
from mistralai import Mistral
import os

s = Mistral(
    api_key=os.getenv("MISTRAL_API_KEY", ""),
)

res = s.files.upload(file={
    "file_name": "example.file",
    "content": open("example.file", "rb"),
})

if res is not None:
    # handle response
    pass

```
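As the tip above notes, a byte array can be passed instead of a stream for smaller files. A minimal sketch of building such a payload (the file name is a placeholder; the dict would be passed as the `file` argument to `s.files.upload(...)`):

```python
from pathlib import Path

# Write a small sample file so this sketch is self-contained.
Path("example.file").write_bytes(b"sample contents")

# Read the whole file into memory as bytes instead of passing an open stream;
# fine for small files, but prefer the stream form shown above for large ones.
payload = {
    "file_name": "example.file",
    "content": Path("example.file").read_bytes(),
}
```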
<!-- End File uploads [file-upload] -->

<!-- Start Retries [retries] -->
## Retries

Some of the endpoints in this SDK support retries. If you use the SDK without any configuration, it will fall back to the default retry strategy provided by the API. However, the default retry strategy can be overridden on a per-operation basis, or across the entire SDK.

To change the default retry strategy for a single API call, simply provide a `RetryConfig` object to the call:
```python
from mistralai import Mistral
from mistralai.utils import BackoffStrategy, RetryConfig
import os

s = Mistral(
    api_key=os.getenv("MISTRAL_API_KEY", ""),
)

# BackoffStrategy(initial_interval, max_interval, exponent, max_elapsed_time);
# intervals are in milliseconds
res = s.models.list(
    retries=RetryConfig("backoff", BackoffStrategy(1, 50, 1.1, 100), False))

if res is not None:
    # handle response
    pass

```

If you'd like to override the default retry strategy for all operations that support retries, you can use the `retry_config` optional parameter when initializing the SDK:
```python
from mistralai.utils import BackoffStrategy, RetryConfig
from mistralai import Mistral
import os

s = Mistral(
    retry_config=RetryConfig("backoff", BackoffStrategy(1, 50, 1.1, 100), False),
    api_key=os.getenv("MISTRAL_API_KEY", ""),
)

res = s.models.list()

if res is not None:
    # handle response
    pass

```
<!-- End Retries [retries] -->

<!-- Start Error Handling [errors] -->
## Error Handling

Handling errors in this SDK should largely match your expectations. All operations return a response object or raise an exception.

By default, an API error will raise a `models.SDKError` exception, which has the following properties:

| Property        | Type             | Description           |
|-----------------|------------------|-----------------------|
| `.status_code`  | *int*            | The HTTP status code  |
| `.message`      | *str*            | The error message     |
| `.raw_response` | *httpx.Response* | The raw HTTP response |
| `.body`         | *str*            | The response content  |

When custom error responses are specified for an operation, the SDK may also raise their associated exceptions. Refer to the respective *Errors* tables in the SDK docs for details on the possible exception types for each operation. For example, the `list_async` method may raise the following exceptions:

| Error Type                 | Status Code | Content Type     |
| -------------------------- | ----------- | ---------------- |
| models.HTTPValidationError | 422         | application/json |
| models.SDKError            | 4XX, 5XX    | \*/\*            |

### Example

```python
from mistralai import Mistral, models
import os

s = Mistral(
    api_key=os.getenv("MISTRAL_API_KEY", ""),
)

res = None
try:
    res = s.models.list()

    if res is not None:
        # handle response
        pass

except models.HTTPValidationError as e:
    # handle e.data: models.HTTPValidationErrorData
    raise e
except models.SDKError as e:
    # handle exception
    raise e
```
<!-- End Error Handling [errors] -->

<!-- Start Server Selection [server] -->
## Server Selection

### Select Server by Name

You can override the default server globally by passing a server name to the `server: str` optional parameter when initializing the SDK client instance. The selected server will then be used as the default for the operations that use it. This table lists the names associated with the available servers:

| Name | Server                   |
| ---- | ------------------------ |
| `eu` | `https://api.mistral.ai` |

#### Example

```python
from mistralai import Mistral
import os

s = Mistral(
    server="eu",
    api_key=os.getenv("MISTRAL_API_KEY", ""),
)

res = s.models.list()

if res is not None:
    # handle response
    pass

```

### Override Server URL Per-Client

The default server can also be overridden per client by passing a URL to the `server_url: str` optional parameter when initializing the SDK client instance. For example:
```python
from mistralai import Mistral
import os

s = Mistral(
    server_url="https://api.mistral.ai",
    api_key=os.getenv("MISTRAL_API_KEY", ""),
)

res = s.models.list()

if res is not None:
    # handle response
    pass

```
<!-- End Server Selection [server] -->

<!-- Start Custom HTTP Client [http-client] -->
## Custom HTTP Client

The Python SDK makes API calls using the [httpx](https://www.python-httpx.org/) HTTP library.  In order to provide a convenient way to configure timeouts, cookies, proxies, custom headers, and other low-level configuration, you can initialize the SDK client with your own HTTP client instance.
Depending on whether you are using the sync or async version of the SDK, you can pass an instance of `HttpClient` or `AsyncHttpClient` respectively; these are Protocols that ensure the client has the necessary methods to make API calls.
This allows you to wrap the client with your own custom logic, such as adding custom headers, logging, or error handling, or you can just pass an instance of `httpx.Client` or `httpx.AsyncClient` directly.

For example, you could specify a header for every request that this SDK makes as follows:
```python
from mistralai import Mistral
import httpx

http_client = httpx.Client(headers={"x-custom-header": "someValue"})
s = Mistral(client=http_client)
```

or you could wrap the client with your own custom logic:
```python
from mistralai import Mistral
from mistralai.httpclient import AsyncHttpClient
from typing import Any, Optional, Union
import httpx

class CustomClient(AsyncHttpClient):
    client: AsyncHttpClient

    def __init__(self, client: AsyncHttpClient):
        self.client = client

    async def send(
        self,
        request: httpx.Request,
        *,
        stream: bool = False,
        auth: Union[
            httpx._types.AuthTypes, httpx._client.UseClientDefault, None
        ] = httpx.USE_CLIENT_DEFAULT,
        follow_redirects: Union[
            bool, httpx._client.UseClientDefault
        ] = httpx.USE_CLIENT_DEFAULT,
    ) -> httpx.Response:
        request.headers["Client-Level-Header"] = "added by client"

        return await self.client.send(
            request, stream=stream, auth=auth, follow_redirects=follow_redirects
        )

    def build_request(
        self,
        method: str,
        url: httpx._types.URLTypes,
        *,
        content: Optional[httpx._types.RequestContent] = None,
        data: Optional[httpx._types.RequestData] = None,
        files: Optional[httpx._types.RequestFiles] = None,
        json: Optional[Any] = None,
        params: Optional[httpx._types.QueryParamTypes] = None,
        headers: Optional[httpx._types.HeaderTypes] = None,
        cookies: Optional[httpx._types.CookieTypes] = None,
        timeout: Union[
            httpx._types.TimeoutTypes, httpx._client.UseClientDefault
        ] = httpx.USE_CLIENT_DEFAULT,
        extensions: Optional[httpx._types.RequestExtensions] = None,
    ) -> httpx.Request:
        return self.client.build_request(
            method,
            url,
            content=content,
            data=data,
            files=files,
            json=json,
            params=params,
            headers=headers,
            cookies=cookies,
            timeout=timeout,
            extensions=extensions,
        )

s = Mistral(async_client=CustomClient(httpx.AsyncClient()))
```
<!-- End Custom HTTP Client [http-client] -->

<!-- Start Authentication [security] -->
## Authentication

### Per-Client Security Schemes

This SDK supports the following security scheme globally:

| Name      | Type | Scheme      | Environment Variable |
| --------- | ---- | ----------- | -------------------- |
| `api_key` | http | HTTP Bearer | `MISTRAL_API_KEY`    |

To authenticate with the API the `api_key` parameter must be set when initializing the SDK client instance. For example:
```python
from mistralai import Mistral
import os

s = Mistral(
    api_key=os.getenv("MISTRAL_API_KEY", ""),
)

res = s.models.list()

if res is not None:
    # handle response
    pass

```
<!-- End Authentication [security] -->

<!-- Start Debugging [debug] -->
## Debugging

You can set up the SDK to emit debug logs for requests and responses.

You can pass your own logger class directly into your SDK.
```python
from mistralai import Mistral
import logging

logging.basicConfig(level=logging.DEBUG)
s = Mistral(debug_logger=logging.getLogger("mistralai"))
```

You can also enable the default debug logger by setting the environment variable `MISTRAL_DEBUG` to `true`.
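This can be done in a POSIX shell, for example:

```shell
# Enable the SDK's default debug logger for this shell session,
# then run your script as usual.
export MISTRAL_DEBUG=true
```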
<!-- End Debugging [debug] -->

<!-- Start IDE Support [idesupport] -->
## IDE Support

### PyCharm

Generally, the SDK will work well with most IDEs out of the box. However, when using PyCharm, you can enjoy much better integration with Pydantic by installing an additional plugin.

- [PyCharm Pydantic Plugin](https://docs.pydantic.dev/latest/integrations/pycharm/)
<!-- End IDE Support [idesupport] -->

<!-- Placeholder for Future Speakeasy SDK Sections -->

# Development

## Contributions

While we value open-source contributions to this SDK, this library is generated programmatically. Any manual changes added to internal files will be overwritten on the next generation. 
We look forward to hearing your feedback. Feel free to open a PR or an issue with a proof of concept and we'll do our best to include it in a future release. 
            

Raw data

            {
    "_id": null,
    "home_page": "https://github.com/mistralai/client-python.git",
    "name": "mistralai",
    "maintainer": null,
    "docs_url": null,
    "requires_python": "<4.0,>=3.8",
    "maintainer_email": null,
    "keywords": null,
    "author": "Mistral",
    "author_email": null,
    "download_url": "https://files.pythonhosted.org/packages/1e/bb/fee796e01fc6e48997508adc71abbf1c31a4615ed0e551c0a97be1b646e9/mistralai-1.2.2.tar.gz",
    "platform": null,
    "description": "# Mistral Python Client\n\n## Migration warning\n \nThis documentation is for Mistral AI SDK v1. You can find more details on how to migrate from v0 to v1 [here](https://github.com/mistralai/client-python/blob/master/MIGRATION.md)\n\n## API Key Setup\n\nBefore you begin, you will need a Mistral AI API key.\n\n1. Get your own Mistral API Key: <https://docs.mistral.ai/#api-access>\n2. Set your Mistral API Key as an environment variable. You only need to do this once.\n\n```bash\n# set Mistral API Key (using zsh for example)\n$ echo 'export MISTRAL_API_KEY=[your_key_here]' >> ~/.zshenv\n\n# reload the environment (or just quit and open a new terminal)\n$ source ~/.zshenv\n```\n\n<!-- Start Summary [summary] -->\n## Summary\n\nMistral AI API: Our Chat Completion and Embeddings APIs specification. Create your account on [La Plateforme](https://console.mistral.ai) to get access and read the [docs](https://docs.mistral.ai) to learn how to use it.\n<!-- End Summary [summary] -->\n\n<!-- Start Table of Contents [toc] -->\n## Table of Contents\n\n* [SDK Installation](https://github.com/mistralai/client-python/blob/master/#sdk-installation)\n* [IDE Support](https://github.com/mistralai/client-python/blob/master/#ide-support)\n* [SDK Example Usage](https://github.com/mistralai/client-python/blob/master/#sdk-example-usage)\n* [Available Resources and Operations](https://github.com/mistralai/client-python/blob/master/#available-resources-and-operations)\n* [Server-sent event streaming](https://github.com/mistralai/client-python/blob/master/#server-sent-event-streaming)\n* [File uploads](https://github.com/mistralai/client-python/blob/master/#file-uploads)\n* [Retries](https://github.com/mistralai/client-python/blob/master/#retries)\n* [Error Handling](https://github.com/mistralai/client-python/blob/master/#error-handling)\n* [Server Selection](https://github.com/mistralai/client-python/blob/master/#server-selection)\n* [Custom HTTP 
Client](https://github.com/mistralai/client-python/blob/master/#custom-http-client)\n* [Authentication](https://github.com/mistralai/client-python/blob/master/#authentication)\n* [Debugging](https://github.com/mistralai/client-python/blob/master/#debugging)\n<!-- End Table of Contents [toc] -->\n\n<!-- Start SDK Installation [installation] -->\n## SDK Installation\n\nThe SDK can be installed with either *pip* or *poetry* package managers.\n\n### PIP\n\n*PIP* is the default package installer for Python, enabling easy installation and management of packages from PyPI via the command line.\n\n```bash\npip install mistralai\n```\n\n### Poetry\n\n*Poetry* is a modern tool that simplifies dependency management and package publishing by using a single `pyproject.toml` file to handle project metadata and dependencies.\n\n```bash\npoetry add mistralai\n```\n<!-- End SDK Installation [installation] -->\n\n<!-- Start SDK Example Usage [usage] -->\n## SDK Example Usage\n\n### Create Chat Completions\n\nThis example shows how to create chat completions.\n\n```python\n# Synchronous Example\nfrom mistralai import Mistral\nimport os\n\ns = Mistral(\n    api_key=os.getenv(\"MISTRAL_API_KEY\", \"\"),\n)\n\nres = s.chat.complete(model=\"mistral-small-latest\", messages=[\n    {\n        \"content\": \"Who is the best French painter? Answer in one short sentence.\",\n        \"role\": \"user\",\n    },\n])\n\nif res is not None:\n    # handle response\n    pass\n```\n\n</br>\n\nThe same SDK client can also be used to make asychronous requests by importing asyncio.\n```python\n# Asynchronous Example\nimport asyncio\nfrom mistralai import Mistral\nimport os\n\nasync def main():\n    s = Mistral(\n        api_key=os.getenv(\"MISTRAL_API_KEY\", \"\"),\n    )\n    res = await s.chat.complete_async(model=\"mistral-small-latest\", messages=[\n        {\n            \"content\": \"Who is the best French painter? 
Answer in one short sentence.\",\n            \"role\": \"user\",\n        },\n    ])\n    if res is not None:\n        # handle response\n        pass\n\nasyncio.run(main())\n```\n\n### Upload a file\n\nThis example shows how to upload a file.\n\n```python\n# Synchronous Example\nfrom mistralai import Mistral\nimport os\n\ns = Mistral(\n    api_key=os.getenv(\"MISTRAL_API_KEY\", \"\"),\n)\n\nres = s.files.upload(file={\n    \"file_name\": \"example.file\",\n    \"content\": open(\"example.file\", \"rb\"),\n})\n\nif res is not None:\n    # handle response\n    pass\n```\n\n</br>\n\nThe same SDK client can also be used to make asychronous requests by importing asyncio.\n```python\n# Asynchronous Example\nimport asyncio\nfrom mistralai import Mistral\nimport os\n\nasync def main():\n    s = Mistral(\n        api_key=os.getenv(\"MISTRAL_API_KEY\", \"\"),\n    )\n    res = await s.files.upload_async(file={\n        \"file_name\": \"example.file\",\n        \"content\": open(\"example.file\", \"rb\"),\n    })\n    if res is not None:\n        # handle response\n        pass\n\nasyncio.run(main())\n```\n\n### Create Agents Completions\n\nThis example shows how to create agents completions.\n\n```python\n# Synchronous Example\nfrom mistralai import Mistral\nimport os\n\ns = Mistral(\n    api_key=os.getenv(\"MISTRAL_API_KEY\", \"\"),\n)\n\nres = s.agents.complete(messages=[\n    {\n        \"content\": \"Who is the best French painter? 
Answer in one short sentence.\",\n        \"role\": \"user\",\n    },\n], agent_id=\"<value>\")\n\nif res is not None:\n    # handle response\n    pass\n```\n\n</br>\n\nThe same SDK client can also be used to make asychronous requests by importing asyncio.\n```python\n# Asynchronous Example\nimport asyncio\nfrom mistralai import Mistral\nimport os\n\nasync def main():\n    s = Mistral(\n        api_key=os.getenv(\"MISTRAL_API_KEY\", \"\"),\n    )\n    res = await s.agents.complete_async(messages=[\n        {\n            \"content\": \"Who is the best French painter? Answer in one short sentence.\",\n            \"role\": \"user\",\n        },\n    ], agent_id=\"<value>\")\n    if res is not None:\n        # handle response\n        pass\n\nasyncio.run(main())\n```\n\n### Create Embedding Request\n\nThis example shows how to create embedding request.\n\n```python\n# Synchronous Example\nfrom mistralai import Mistral\nimport os\n\ns = Mistral(\n    api_key=os.getenv(\"MISTRAL_API_KEY\", \"\"),\n)\n\nres = s.embeddings.create(inputs=[\n    \"Embed this sentence.\",\n    \"As well as this one.\",\n], model=\"Wrangler\")\n\nif res is not None:\n    # handle response\n    pass\n```\n\n</br>\n\nThe same SDK client can also be used to make asychronous requests by importing asyncio.\n```python\n# Asynchronous Example\nimport asyncio\nfrom mistralai import Mistral\nimport os\n\nasync def main():\n    s = Mistral(\n        api_key=os.getenv(\"MISTRAL_API_KEY\", \"\"),\n    )\n    res = await s.embeddings.create_async(inputs=[\n        \"Embed this sentence.\",\n        \"As well as this one.\",\n    ], model=\"Wrangler\")\n    if res is not None:\n        # handle response\n        pass\n\nasyncio.run(main())\n```\n<!-- End SDK Example Usage [usage] -->\n\n\n### More examples\n\nYou can run the examples in the `examples/` directory using `poetry run` or by entering the virtual environment using `poetry shell`.\n\n\n## Providers' SDKs Example Usage\n\n### Azure 
AI\n\n**Prerequisites**\n\nBefore you begin, ensure you have an `AZURE_ENDPOINT` and an `AZURE_API_KEY`. To obtain these, you will need to deploy Mistral on Azure AI.\nSee [instructions for deploying Mistral on Azure AI here](https://docs.mistral.ai/deployment/cloud/azure/).\n\nHere's a basic example to get you started. You can also run [the example in the `examples` directory](https://github.com/mistralai/client-python/blob/master/examples/azure).\n\n```python\nimport asyncio\nimport os\n\nfrom mistralai_azure import MistralAzure\n\nclient = MistralAzure(\n    azure_api_key=os.getenv(\"AZURE_API_KEY\", \"\"),\n    azure_endpoint=os.getenv(\"AZURE_ENDPOINT\", \"\")\n)\n\nasync def main() -> None:\n    res = await client.chat.complete_async(\n        max_tokens=100,\n        temperature=0.5,\n        messages=[\n            {\n                \"content\": \"Hello there!\",\n                \"role\": \"user\"\n            }\n        ]\n    )\n    print(res)\n\nasyncio.run(main())\n```\nThe documentation for the Azure SDK is available [here](https://github.com/mistralai/client-python/blob/master/packages/mistralai_azure/README.md).\n\n### Google Cloud\n\n\n**Prerequisites**\n\nBefore you begin, you will need to create a Google Cloud project and enable the Mistral API. To do this, follow the instructions [here](https://docs.mistral.ai/deployment/cloud/vertex/).\n\nTo run this locally, you will also need to ensure you are authenticated with Google Cloud. 
You can do this by running\n\n```bash\ngcloud auth application-default login\n```\n\n**Step 1: Install**\n\nInstall the extras dependencies specific to Google Cloud:\n\n```bash\npip install mistralai[gcp]\n```\n\n**Step 2: Example Usage**\n\nHere's a basic example to get you started.\n\n```python\nimport asyncio\nfrom mistralai_gcp import MistralGoogleCloud\n\nclient = MistralGoogleCloud()\n\n\nasync def main() -> None:\n    res = await client.chat.complete_async(\n        model= \"mistral-small-2402\",\n        messages= [\n            {\n                \"content\": \"Hello there!\",\n                \"role\": \"user\"\n            }\n        ]\n    )\n    print(res)\n\nasyncio.run(main())\n```\n\nThe documentation for the GCP SDK is available [here](https://github.com/mistralai/client-python/blob/master/packages/mistralai_gcp/README.md).\n\n\n<!-- Start Available Resources and Operations [operations] -->\n## Available Resources and Operations\n\n<details open>\n<summary>Available methods</summary>\n\n### [agents](https://github.com/mistralai/client-python/blob/master/docs/sdks/agents/README.md)\n\n* [complete](https://github.com/mistralai/client-python/blob/master/docs/sdks/agents/README.md#complete) - Agents Completion\n* [stream](https://github.com/mistralai/client-python/blob/master/docs/sdks/agents/README.md#stream) - Stream Agents completion\n\n### [batch](https://github.com/mistralai/client-python/blob/master/docs/sdks/batch/README.md)\n\n\n#### [batch.jobs](https://github.com/mistralai/client-python/blob/master/docs/sdks/mistraljobs/README.md)\n\n* [list](https://github.com/mistralai/client-python/blob/master/docs/sdks/mistraljobs/README.md#list) - Get Batch Jobs\n* [create](https://github.com/mistralai/client-python/blob/master/docs/sdks/mistraljobs/README.md#create) - Create Batch Job\n* [get](https://github.com/mistralai/client-python/blob/master/docs/sdks/mistraljobs/README.md#get) - Get Batch Job\n* 
[cancel](https://github.com/mistralai/client-python/blob/master/docs/sdks/mistraljobs/README.md#cancel) - Cancel Batch Job\n\n### [chat](https://github.com/mistralai/client-python/blob/master/docs/sdks/chat/README.md)\n\n* [complete](https://github.com/mistralai/client-python/blob/master/docs/sdks/chat/README.md#complete) - Chat Completion\n* [stream](https://github.com/mistralai/client-python/blob/master/docs/sdks/chat/README.md#stream) - Stream chat completion\n\n### [classifiers](https://github.com/mistralai/client-python/blob/master/docs/sdks/classifiers/README.md)\n\n* [moderate](https://github.com/mistralai/client-python/blob/master/docs/sdks/classifiers/README.md#moderate) - Moderations\n* [moderate_chat](https://github.com/mistralai/client-python/blob/master/docs/sdks/classifiers/README.md#moderate_chat) - Moderations Chat\n\n### [embeddings](https://github.com/mistralai/client-python/blob/master/docs/sdks/embeddings/README.md)\n\n* [create](https://github.com/mistralai/client-python/blob/master/docs/sdks/embeddings/README.md#create) - Embeddings\n\n### [files](https://github.com/mistralai/client-python/blob/master/docs/sdks/files/README.md)\n\n* [upload](https://github.com/mistralai/client-python/blob/master/docs/sdks/files/README.md#upload) - Upload File\n* [list](https://github.com/mistralai/client-python/blob/master/docs/sdks/files/README.md#list) - List Files\n* [retrieve](https://github.com/mistralai/client-python/blob/master/docs/sdks/files/README.md#retrieve) - Retrieve File\n* [delete](https://github.com/mistralai/client-python/blob/master/docs/sdks/files/README.md#delete) - Delete File\n* [download](https://github.com/mistralai/client-python/blob/master/docs/sdks/files/README.md#download) - Download File\n\n### [fim](https://github.com/mistralai/client-python/blob/master/docs/sdks/fim/README.md)\n\n* [complete](https://github.com/mistralai/client-python/blob/master/docs/sdks/fim/README.md#complete) - Fim Completion\n* 
[stream](https://github.com/mistralai/client-python/blob/master/docs/sdks/fim/README.md#stream) - Stream fim completion\n\n### [fine_tuning](https://github.com/mistralai/client-python/blob/master/docs/sdks/finetuning/README.md)\n\n\n#### [fine_tuning.jobs](https://github.com/mistralai/client-python/blob/master/docs/sdks/jobs/README.md)\n\n* [list](https://github.com/mistralai/client-python/blob/master/docs/sdks/jobs/README.md#list) - Get Fine Tuning Jobs\n* [create](https://github.com/mistralai/client-python/blob/master/docs/sdks/jobs/README.md#create) - Create Fine Tuning Job\n* [get](https://github.com/mistralai/client-python/blob/master/docs/sdks/jobs/README.md#get) - Get Fine Tuning Job\n* [cancel](https://github.com/mistralai/client-python/blob/master/docs/sdks/jobs/README.md#cancel) - Cancel Fine Tuning Job\n* [start](https://github.com/mistralai/client-python/blob/master/docs/sdks/jobs/README.md#start) - Start Fine Tuning Job\n\n\n### [models](https://github.com/mistralai/client-python/blob/master/docs/sdks/models/README.md)\n\n* [list](https://github.com/mistralai/client-python/blob/master/docs/sdks/models/README.md#list) - List Models\n* [retrieve](https://github.com/mistralai/client-python/blob/master/docs/sdks/models/README.md#retrieve) - Retrieve Model\n* [delete](https://github.com/mistralai/client-python/blob/master/docs/sdks/models/README.md#delete) - Delete Model\n* [update](https://github.com/mistralai/client-python/blob/master/docs/sdks/models/README.md#update) - Update Fine Tuned Model\n* [archive](https://github.com/mistralai/client-python/blob/master/docs/sdks/models/README.md#archive) - Archive Fine Tuned Model\n* [unarchive](https://github.com/mistralai/client-python/blob/master/docs/sdks/models/README.md#unarchive) - Unarchive Fine Tuned Model\n\n</details>\n<!-- End Available Resources and Operations [operations] -->\n\n<!-- Start Server-sent event streaming [eventstream] -->\n## Server-sent event streaming\n\n[Server-sent events][mdn-sse] 
are used to stream content from certain\noperations. These operations will expose the stream as a [Generator][generator] that\ncan be consumed using a simple `for` loop. The loop will\nterminate when the server no longer has any events to send and closes the\nunderlying connection.\n\n```python\nfrom mistralai import Mistral\nimport os\n\ns = Mistral(\n    api_key=os.getenv(\"MISTRAL_API_KEY\", \"\"),\n)\n\nres = s.chat.stream(model=\"mistral-small-latest\", messages=[\n    {\n        \"content\": \"Who is the best French painter? Answer in one short sentence.\",\n        \"role\": \"user\",\n    },\n])\n\nif res is not None:\n    for event in res:\n        # handle event\n        print(event, flush=True)\n\n```\n\n[mdn-sse]: https://developer.mozilla.org/en-US/docs/Web/API/Server-sent_events/Using_server-sent_events\n[generator]: https://wiki.python.org/moin/Generators\n<!-- End Server-sent event streaming [eventstream] -->\n\n<!-- Start File uploads [file-upload] -->\n## File uploads\n\nCertain SDK methods accept file objects as part of a request body or multi-part request. It is possible and typically recommended to upload files as a stream rather than reading the entire contents into memory. This avoids excessive memory consumption and potentially crashing with out-of-memory errors when working with very large files. The following example demonstrates how to attach a file stream to a request.\n\n> [!TIP]\n>\n> For endpoints that handle file uploads, byte arrays can also be used. 
However, using streams is recommended for large files.\n>\n\n```python\nfrom mistralai import Mistral\nimport os\n\ns = Mistral(\n    api_key=os.getenv(\"MISTRAL_API_KEY\", \"\"),\n)\n\nres = s.files.upload(file={\n    \"file_name\": \"example.file\",\n    \"content\": open(\"example.file\", \"rb\"),\n})\n\nif res is not None:\n    # handle response\n    pass\n\n```\n<!-- End File uploads [file-upload] -->\n\n<!-- Start Retries [retries] -->\n## Retries\n\nSome of the endpoints in this SDK support retries. If you use the SDK without any configuration, it will fall back to the default retry strategy provided by the API. However, the default retry strategy can be overridden on a per-operation basis, or across the entire SDK.\n\nTo change the default retry strategy for a single API call, simply provide a `RetryConfig` object to the call:\n```python\nfrom mistralai.utils import BackoffStrategy, RetryConfig\nfrom mistralai import Mistral\nimport os\n\ns = Mistral(\n    api_key=os.getenv(\"MISTRAL_API_KEY\", \"\"),\n)\n\nres = s.models.list(\n    retries=RetryConfig(\"backoff\", BackoffStrategy(1, 50, 1.1, 100), False))\n\nif res is not None:\n    # handle response\n    pass\n\n```\n\nIf you'd like to override the default retry strategy for all operations that support retries, you can use the `retry_config` optional parameter when initializing the SDK:\n```python\nfrom mistralai.utils import BackoffStrategy, RetryConfig\nfrom mistralai import Mistral\nimport os\n\ns = Mistral(\n    retry_config=RetryConfig(\"backoff\", BackoffStrategy(1, 50, 1.1, 100), False),\n    api_key=os.getenv(\"MISTRAL_API_KEY\", \"\"),\n)\n\nres = s.models.list()\n\nif res is not None:\n    # handle response\n    pass\n\n```\n<!-- End Retries [retries] -->\n\n<!-- Start Error Handling [errors] -->\n## Error Handling\n\nHandling errors in this SDK should largely match your expectations. 
All operations return a response object or raise an exception.\n\nBy default, an API error will raise a `models.SDKError` exception, which has the following properties:\n\n| Property        | Type             | Description           |\n|-----------------|------------------|-----------------------|\n| `.status_code`  | *int*            | The HTTP status code  |\n| `.message`      | *str*            | The error message     |\n| `.raw_response` | *httpx.Response* | The raw HTTP response |\n| `.body`         | *str*            | The response content  |\n\nWhen custom error responses are specified for an operation, the SDK may also raise their associated exceptions. You can refer to the respective *Errors* tables in the SDK docs for more details on the possible exception types for each operation. For example, the `list_async` method may raise the following exceptions:\n\n| Error Type                 | Status Code | Content Type     |\n| -------------------------- | ----------- | ---------------- |\n| models.HTTPValidationError | 422         | application/json |\n| models.SDKError            | 4XX, 5XX    | \\*/\\*            |\n\n### Example\n\n```python\nfrom mistralai import Mistral, models\nimport os\n\ns = Mistral(\n    api_key=os.getenv(\"MISTRAL_API_KEY\", \"\"),\n)\n\nres = None\ntry:\n    res = s.models.list()\n\n    if res is not None:\n        # handle response\n        pass\n\nexcept models.HTTPValidationError as e:\n    # handle e.data: models.HTTPValidationErrorData\n    raise e\nexcept models.SDKError as e:\n    # handle exception\n    raise e\n```\n<!-- End Error Handling [errors] -->\n\n<!-- Start Server Selection [server] -->\n## Server Selection\n\n### Select Server by Name\n\nYou can override the default server globally by passing a server name to the `server: str` optional parameter when initializing the SDK client instance. The selected server will then be used as the default on the operations that use it. 
This table lists the names associated with the available servers:\n\n| Name | Server                   |\n| ---- | ------------------------ |\n| `eu` | `https://api.mistral.ai` |\n\n#### Example\n\n```python\nfrom mistralai import Mistral\nimport os\n\ns = Mistral(\n    server=\"eu\",\n    api_key=os.getenv(\"MISTRAL_API_KEY\", \"\"),\n)\n\nres = s.models.list()\n\nif res is not None:\n    # handle response\n    pass\n\n```\n\n### Override Server URL Per-Client\n\nThe default server can also be overridden globally by passing a URL to the `server_url: str` optional parameter when initializing the SDK client instance. For example:\n```python\nfrom mistralai import Mistral\nimport os\n\ns = Mistral(\n    server_url=\"https://api.mistral.ai\",\n    api_key=os.getenv(\"MISTRAL_API_KEY\", \"\"),\n)\n\nres = s.models.list()\n\nif res is not None:\n    # handle response\n    pass\n\n```\n<!-- End Server Selection [server] -->\n\n<!-- Start Custom HTTP Client [http-client] -->\n## Custom HTTP Client\n\nThe Python SDK makes API calls using the [httpx](https://www.python-httpx.org/) HTTP library.  
In order to provide a convenient way to configure timeouts, cookies, proxies, custom headers, and other low-level configuration, you can initialize the SDK client with your own HTTP client instance.\nDepending on whether you are using the sync or async version of the SDK, you can pass an instance of `HttpClient` or `AsyncHttpClient` respectively, which are Protocols ensuring that the client has the necessary methods to make API calls.\nThis allows you to wrap the client with your own custom logic, such as adding custom headers, logging, or error handling, or you can just pass an instance of `httpx.Client` or `httpx.AsyncClient` directly.\n\nFor example, you could specify a header for every request that this SDK makes as follows:\n```python\nfrom mistralai import Mistral\nimport httpx\n\nhttp_client = httpx.Client(headers={\"x-custom-header\": \"someValue\"})\ns = Mistral(client=http_client)\n```\n\nOr you could wrap the client with your own custom logic:\n```python\nfrom typing import Any, Optional, Union\n\nfrom mistralai import Mistral\nfrom mistralai.httpclient import AsyncHttpClient\nimport httpx\n\nclass CustomClient(AsyncHttpClient):\n    client: AsyncHttpClient\n\n    def __init__(self, client: AsyncHttpClient):\n        self.client = client\n\n    async def send(\n        self,\n        request: httpx.Request,\n        *,\n        stream: bool = False,\n        auth: Union[\n            httpx._types.AuthTypes, httpx._client.UseClientDefault, None\n        ] = httpx.USE_CLIENT_DEFAULT,\n        follow_redirects: Union[\n            bool, httpx._client.UseClientDefault\n        ] = httpx.USE_CLIENT_DEFAULT,\n    ) -> httpx.Response:\n        request.headers[\"Client-Level-Header\"] = \"added by client\"\n\n        return await self.client.send(\n            request, stream=stream, auth=auth, follow_redirects=follow_redirects\n        )\n\n    def build_request(\n        self,\n        method: str,\n        url: httpx._types.URLTypes,\n        *,\n        content: Optional[httpx._types.RequestContent] 
= None,\n        data: Optional[httpx._types.RequestData] = None,\n        files: Optional[httpx._types.RequestFiles] = None,\n        json: Optional[Any] = None,\n        params: Optional[httpx._types.QueryParamTypes] = None,\n        headers: Optional[httpx._types.HeaderTypes] = None,\n        cookies: Optional[httpx._types.CookieTypes] = None,\n        timeout: Union[\n            httpx._types.TimeoutTypes, httpx._client.UseClientDefault\n        ] = httpx.USE_CLIENT_DEFAULT,\n        extensions: Optional[httpx._types.RequestExtensions] = None,\n    ) -> httpx.Request:\n        return self.client.build_request(\n            method,\n            url,\n            content=content,\n            data=data,\n            files=files,\n            json=json,\n            params=params,\n            headers=headers,\n            cookies=cookies,\n            timeout=timeout,\n            extensions=extensions,\n        )\n\ns = Mistral(async_client=CustomClient(httpx.AsyncClient()))\n```\n<!-- End Custom HTTP Client [http-client] -->\n\n<!-- Start Authentication [security] -->\n## Authentication\n\n### Per-Client Security Schemes\n\nThis SDK supports the following security scheme globally:\n\n| Name      | Type | Scheme      | Environment Variable |\n| --------- | ---- | ----------- | -------------------- |\n| `api_key` | http | HTTP Bearer | `MISTRAL_API_KEY`    |\n\nTo authenticate with the API the `api_key` parameter must be set when initializing the SDK client instance. 
For example:\n```python\nfrom mistralai import Mistral\nimport os\n\ns = Mistral(\n    api_key=os.getenv(\"MISTRAL_API_KEY\", \"\"),\n)\n\nres = s.models.list()\n\nif res is not None:\n    # handle response\n    pass\n\n```\n<!-- End Authentication [security] -->\n\n<!-- Start Debugging [debug] -->\n## Debugging\n\nYou can set up your SDK to emit debug logs for SDK requests and responses.\n\nYou can pass your own logger class directly into your SDK.\n```python\nfrom mistralai import Mistral\nimport logging\n\nlogging.basicConfig(level=logging.DEBUG)\ns = Mistral(debug_logger=logging.getLogger(\"mistralai\"))\n```\n\nYou can also enable a default debug logger by setting the environment variable `MISTRAL_DEBUG` to `true`.\n<!-- End Debugging [debug] -->\n\n<!-- Start IDE Support [idesupport] -->\n## IDE Support\n\n### PyCharm\n\nGenerally, the SDK will work well with most IDEs out of the box. However, when using PyCharm, you can enjoy much better integration with Pydantic by installing an additional plugin.\n\n- [PyCharm Pydantic Plugin](https://docs.pydantic.dev/latest/integrations/pycharm/)\n<!-- End IDE Support [idesupport] -->\n\n<!-- Placeholder for Future Speakeasy SDK Sections -->\n\n# Development\n\n## Contributions\n\nWhile we value open-source contributions to this SDK, this library is generated programmatically. Any manual changes added to internal files will be overwritten on the next generation.\nWe look forward to hearing your feedback. Feel free to open a PR or an issue with a proof of concept and we'll do our best to include it in a future release. ",
    "bugtrack_url": null,
    "license": null,
    "summary": "Python Client SDK for the Mistral AI API.",
    "version": "1.2.2",
    "project_urls": {
        "Homepage": "https://github.com/mistralai/client-python.git",
        "Repository": "https://github.com/mistralai/client-python.git"
    },
    "split_keywords": [],
    "urls": [
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "58763cdbd29e240ed42fc0bdaedab0c2aab0c98f0c315327dd5ed3800a53f756",
                "md5": "592f70dd63b4f536d085a3bd17bbe84d",
                "sha256": "f34e1e05712b0224af9ea6ffbd71a2671543e8eee4af6f22d279aa22e0bece56"
            },
            "downloads": -1,
            "filename": "mistralai-1.2.2-py3-none-any.whl",
            "has_sig": false,
            "md5_digest": "592f70dd63b4f536d085a3bd17bbe84d",
            "packagetype": "bdist_wheel",
            "python_version": "py3",
            "requires_python": "<4.0,>=3.8",
            "size": 254566,
            "upload_time": "2024-11-12T18:29:20",
            "upload_time_iso_8601": "2024-11-12T18:29:20.412361Z",
            "url": "https://files.pythonhosted.org/packages/58/76/3cdbd29e240ed42fc0bdaedab0c2aab0c98f0c315327dd5ed3800a53f756/mistralai-1.2.2-py3-none-any.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "1ebbfee796e01fc6e48997508adc71abbf1c31a4615ed0e551c0a97be1b646e9",
                "md5": "82efa82efe6640f381ccc61d15256a12",
                "sha256": "51d53336aebba7dd936a320e8b51d09f5dc30c5de8b21c080b1cad9aab6f561b"
            },
            "downloads": -1,
            "filename": "mistralai-1.2.2.tar.gz",
            "has_sig": false,
            "md5_digest": "82efa82efe6640f381ccc61d15256a12",
            "packagetype": "sdist",
            "python_version": "source",
            "requires_python": "<4.0,>=3.8",
            "size": 129103,
            "upload_time": "2024-11-12T18:29:22",
            "upload_time_iso_8601": "2024-11-12T18:29:22.373965Z",
            "url": "https://files.pythonhosted.org/packages/1e/bb/fee796e01fc6e48997508adc71abbf1c31a4615ed0e551c0a97be1b646e9/mistralai-1.2.2.tar.gz",
            "yanked": false,
            "yanked_reason": null
        }
    ],
    "upload_time": "2024-11-12 18:29:22",
    "github": true,
    "gitlab": false,
    "bitbucket": false,
    "codeberg": false,
    "github_user": "mistralai",
    "github_project": "client-python",
    "travis_ci": false,
    "coveralls": false,
    "github_actions": true,
    "lcname": "mistralai"
}
        