openinference-instrumentation-mistralai

Name: openinference-instrumentation-mistralai
Version: 1.3.4
Summary: OpenInference Mistral AI Instrumentation
Author email: OpenInference Authors <oss@arize.com>
Homepage: https://github.com/Arize-ai/openinference/tree/main/python/instrumentation/openinference-instrumentation-mistralai
Requires Python: >=3.9, <3.15
Upload time: 2025-10-10 03:49:02
Requirements: openai (>=1.0.0), python-dotenv (>=1.0.0), grpcio (>=1.60.0), opentelemetry-api (>=1.22.0), opentelemetry-sdk (>=1.22.0), opentelemetry-instrumentation-openai (>=0.40b0), opentelemetry-instrumentation-langchain (>=0.40b0), opentelemetry-exporter-otlp-proto-grpc (>=1.22.0), langchain (>=0.1.0), langgraph (>=0.0.15), typing-extensions (>=4.8.0), phoenix-client (>=0.1.0), openinference-instrumentation-openllmetry (>=0.1.0)
# OpenInference Mistral AI Instrumentation

[![PyPI Version](https://img.shields.io/pypi/v/openinference-instrumentation-mistralai.svg)](https://pypi.python.org/pypi/openinference-instrumentation-mistralai)

A Python auto-instrumentation library for the MistralAI Python SDK.

The traces emitted by this instrumentation are fully OpenTelemetry-compatible and can be sent to an OpenTelemetry collector for viewing, such as [`arize-phoenix`](https://github.com/Arize-ai/phoenix).

## Installation

```shell
pip install openinference-instrumentation-mistralai
```

## Quickstart

In this example, we will instrument a small program that uses the MistralAI chat completions API and observe the traces in [`arize-phoenix`](https://github.com/Arize-ai/phoenix).

Install packages.

```shell
pip install openinference-instrumentation-mistralai mistralai arize-phoenix opentelemetry-sdk opentelemetry-exporter-otlp
```

Start the Phoenix server so that it is ready to collect traces.
The Phoenix server runs entirely on your machine and does not send data over the internet.

```shell
python -m phoenix.server.main serve
```
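
Alternatively, Phoenix can be started from Python rather than the command line; a minimal sketch, assuming the `launch_app` helper available in recent `arize-phoenix` releases:

```python
# Sketch: run Phoenix in-process instead of via the CLI
# (assumes arize-phoenix exposes launch_app in your installed version).
import phoenix as px

session = px.launch_app()  # serves the UI and OTLP collector on localhost:6006
print(session.url)         # URL of the local Phoenix UI
```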

In a Python file, set up the `MistralAIInstrumentor` and configure the tracer provider to send traces to Phoenix.

```python
from mistralai.client import MistralClient
from mistralai.models.chat_completion import ChatMessage
from openinference.instrumentation.mistralai import MistralAIInstrumentor
from opentelemetry import trace as trace_api
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk import trace as trace_sdk
from opentelemetry.sdk.trace.export import ConsoleSpanExporter, SimpleSpanProcessor

# Export spans over OTLP/HTTP to the local Phoenix collector.
endpoint = "http://127.0.0.1:6006/v1/traces"
tracer_provider = trace_sdk.TracerProvider()
tracer_provider.add_span_processor(SimpleSpanProcessor(OTLPSpanExporter(endpoint)))
# Optionally, also print the spans to the console.
tracer_provider.add_span_processor(SimpleSpanProcessor(ConsoleSpanExporter()))
trace_api.set_tracer_provider(tracer_provider)

# Patch the MistralAI client so that all chat calls emit OpenInference spans.
MistralAIInstrumentor().instrument()


if __name__ == "__main__":
    # The client authenticates via the MISTRAL_API_KEY environment variable.
    client = MistralClient()
    response = client.chat(
        model="mistral-large-latest",
        messages=[
            ChatMessage(
                content="Who won the World Cup in 2018?",
                role="user",
            )
        ],
    )
    print(response.choices[0].message.content)
```
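
`SimpleSpanProcessor` exports each span synchronously as it ends, which is convenient for a demo but adds latency to every call. For longer-running applications you may prefer the OpenTelemetry SDK's `BatchSpanProcessor`, which queues spans and exports them in the background; a minimal sketch reusing the same endpoint:

```python
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk import trace as trace_sdk
from opentelemetry.sdk.trace.export import BatchSpanProcessor

endpoint = "http://127.0.0.1:6006/v1/traces"
tracer_provider = trace_sdk.TracerProvider()
# BatchSpanProcessor buffers spans and exports them off the hot path.
tracer_provider.add_span_processor(BatchSpanProcessor(OTLPSpanExporter(endpoint)))
```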

To authenticate with the MistralAI API, set the `MISTRAL_API_KEY` environment variable before running the program.

```shell
export MISTRAL_API_KEY=[your_key_here]
```
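
Alternatively, since `python-dotenv` appears in the requirements above, the key can live in a local `.env` file and be loaded before the client is created; a minimal sketch (the `.env` file and its location are assumptions for illustration):

```python
# Sketch: load MISTRAL_API_KEY from a .env file in the working directory
# containing a line such as: MISTRAL_API_KEY=your_key_here
from dotenv import load_dotenv

load_dotenv()  # populates os.environ before MistralClient() is constructed
```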

Now run the Python file and view the traces in the Phoenix UI at `http://localhost:6006`.

```shell
python your_file.py
```
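
When you no longer want tracing (for example, in tests), the instrumentation can be removed via the standard `uninstrument` method that OpenTelemetry instrumentors expose:

```python
from openinference.instrumentation.mistralai import MistralAIInstrumentor

# Restore the original, unpatched MistralAI client methods.
MistralAIInstrumentor().uninstrument()
```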

## More Info

* [More info on OpenInference and Phoenix](https://docs.arize.com/phoenix)
* [How to customize spans to track sessions, metadata, etc.](https://github.com/Arize-ai/openinference/tree/main/python/openinference-instrumentation#customizing-spans) (a minimal sketch follows below)
* [How to account for private information and span payload customization](https://github.com/Arize-ai/openinference/tree/main/python/openinference-instrumentation#tracing-configuration)
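
As a taste of the span customization described in the second link, the shared `openinference-instrumentation` package provides a `using_attributes` context manager for attaching session and metadata attributes to the spans created inside it; a minimal sketch (the session ID, user ID, and metadata values are made up for illustration):

```python
from mistralai.client import MistralClient
from mistralai.models.chat_completion import ChatMessage
from openinference.instrumentation import using_attributes

client = MistralClient()
# Every span created inside this block carries the session, user, and
# metadata attributes, so Phoenix can group the resulting traces by session.
with using_attributes(
    session_id="my-session-id",  # hypothetical session identifier
    user_id="my-user-id",        # hypothetical user identifier
    metadata={"experiment": "quickstart-demo"},
):
    response = client.chat(
        model="mistral-large-latest",
        messages=[ChatMessage(content="And who won in 2022?", role="user")],
    )
```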
