openinference-instrumentation


Name: openinference-instrumentation
Version: 0.1.22
Summary: OpenInference instrumentation utilities
Author: OpenInference Authors <oss@arize.com>
Home page: https://github.com/Arize-ai/openinference/tree/main/python/openinference-instrumentation
Requires Python: >=3.8, <3.14
License: None
Upload time: 2025-02-04 23:11:15
# OpenInference Instrumentation

[![PyPI Version](https://img.shields.io/pypi/v/openinference-instrumentation.svg)](https://pypi.python.org/pypi/openinference-instrumentation) 

Utility functions for OpenInference instrumentation.

## Installation

```shell
pip install openinference-instrumentation
```

## Customizing Spans

The `openinference-instrumentation` package offers utilities to track important application information, such as sessions, users, and custom metadata, using Python context managers:

* `using_session`: to specify a session ID that tracks and groups a multi-turn conversation with a user
* `using_user`: to specify a user ID that tracks different conversations with a given user
* `using_metadata`: to add custom metadata that provides extra information in support of a wide range of operational needs
* `using_tag`: to add tags that help filter on specific keywords
* `using_prompt_template`: to record the prompt template used, along with its version and variables; useful for prompt template management
* `using_attributes`: to set several of the above at once in a concise manner (a sketch of using the individual context managers follows the combined example below)
  
For example:

```python
from openinference.instrumentation import using_attributes

tags = ["business_critical", "simple"]  # ...plus any other tags you want to filter on
metadata = {
    "country": "United States",
    "topic": "weather",
    # ...plus any other key-value pairs
}
prompt_template = "Please describe the weather forecast for {city} on {date}"
prompt_template_variables = {"city": "Johannesburg", "date": "July 11"}
prompt_template_version = "v1.0"
with using_attributes(
    session_id="my-session-id",
    user_id="my-user-id",
    metadata=metadata,
    tags=tags,
    prompt_template=prompt_template,
    prompt_template_version=prompt_template_version,
    prompt_template_variables=prompt_template_variables,
):
    # Calls within this block will generate spans with the attributes:
    # "session.id" = "my-session-id"
    # "user.id" = "my-user-id"
    # "metadata" = "{\"key-1\": value_1, \"key-2\": value_2, ... }" # JSON serialized
    # "tag.tags" = "["tag_1","tag_2",...]"
    # "llm.prompt_template.template" = "Please describe the weather forecast for {city} on {date}"
    # "llm.prompt_template.variables" = "{\"city\": \"Johannesburg\", \"date\": \"July 11\"}" # JSON serialized
    # "llm.prompt_template.version " = "v1.0"
    ...
```
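
The individual context managers listed above can also be used (and nested) on their own when only some of these attributes are needed. A minimal sketch, assuming the keyword names shown match the package's signatures:

```python
from openinference.instrumentation import using_metadata, using_session, using_user

# Nested context managers accumulate: spans created in the innermost block
# carry the session ID, user ID, and metadata from all enclosing blocks.
with using_session(session_id="my-session-id"):
    with using_user(user_id="my-user-id"):
        with using_metadata({"topic": "weather"}):
            ...  # calls here produce spans with session.id, user.id, and metadata set
```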

You can read more about this in our [docs](https://docs.arize.com/phoenix/tracing/how-to-tracing/customize-spans).

## Tracing Configuration

This package contains the central `TraceConfig` class, which lets you specify a tracing configuration that controls settings such as data privacy and payload size. For instance, you may want to keep sensitive information from being logged for security reasons, or you may want to limit the size of base64-encoded images that are logged in order to reduce payload size.

In addition, you can also use environment variables; read more [here](../../spec/configuration.md). The following is an example of using the `TraceConfig` object:

```python
from openinference.instrumentation import TraceConfig

# This example uses the OpenAIInstrumentor, but it works with any of our auto instrumentors
from openinference.instrumentation.openai import OpenAIInstrumentor

# Each hide_* argument expects a boolean; base64_image_max_length expects an integer.
config = TraceConfig(
    hide_inputs=hide_inputs,
    hide_outputs=hide_outputs,
    hide_input_messages=hide_input_messages,
    hide_output_messages=hide_output_messages,
    hide_input_images=hide_input_images,
    hide_input_text=hide_input_text,
    hide_output_text=hide_output_text,
    base64_image_max_length=base64_image_max_length,
)

tracer_provider = ...
OpenAIInstrumentor().instrument(tracer_provider=tracer_provider, config=config)
```
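
Alternatively, the same settings can be supplied through environment variables, as described in the configuration spec linked above. A minimal sketch, assuming variable names of the `OPENINFERENCE_*` form documented there:

```python
import os

from openinference.instrumentation import TraceConfig

# Assumed variable names; consult the linked configuration spec for the
# authoritative list and default values.
os.environ["OPENINFERENCE_HIDE_INPUTS"] = "true"               # mask input values
os.environ["OPENINFERENCE_HIDE_OUTPUT_MESSAGES"] = "true"      # mask output messages
os.environ["OPENINFERENCE_BASE64_IMAGE_MAX_LENGTH"] = "32000"  # cap logged image size

# A TraceConfig constructed without explicit arguments typically falls back
# to these environment variables, and to library defaults when neither is set.
config = TraceConfig()
```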

            
