| Field | Value |
| --- | --- |
| Name | openinference-instrumentation-openai |
| Version | 0.1.19 |
| Summary | OpenInference OpenAI Instrumentation |
| Upload time | 2025-02-04 23:11:09 |
| Home page | None |
| Author | None |
| Maintainer | None |
| Docs URL | None |
| License | None |
| Keywords | None |
| Requires Python | <3.14,>=3.9 |
| Requirements | No requirements were recorded. |
# OpenInference OpenAI Instrumentation
[PyPI](https://pypi.org/project/openinference-instrumentation-openai/)

Python auto-instrumentation library for OpenAI's Python SDK.
The traces emitted by this instrumentation are fully OpenTelemetry-compatible and can be sent to an OpenTelemetry collector for viewing, such as [`arize-phoenix`](https://github.com/Arize-ai/phoenix).
## Installation
```shell
pip install openinference-instrumentation-openai
```
## Quickstart
In this example we will instrument a small program that uses OpenAI and observe the traces via [`arize-phoenix`](https://github.com/Arize-ai/phoenix).
Install packages.
```shell
pip install openinference-instrumentation-openai "openai>=1.26" arize-phoenix opentelemetry-sdk opentelemetry-exporter-otlp
```
Start the Phoenix server so that it is ready to collect traces.
The Phoenix server runs entirely on your machine and does not send data over the internet.
```shell
python -m phoenix.server.main serve
```
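If you prefer to stay in Python, `arize-phoenix` can also launch the app in-process. A minimal sketch, assuming the `arize-phoenix` package installed above (equivalent to the CLI command for this quickstart):

```python
import phoenix as px

# Launches the Phoenix app in the current process; by default the UI is
# served locally on port 6006, matching the collector endpoint used below.
session = px.launch_app()
print(session.url)  # local URL of the Phoenix UI
```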
In a Python file, set up the `OpenAIInstrumentor` and configure the tracer provider to send traces to Phoenix.
```python
import openai
from openinference.instrumentation.openai import OpenAIInstrumentor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk import trace as trace_sdk
from opentelemetry.sdk.trace.export import ConsoleSpanExporter, SimpleSpanProcessor

endpoint = "http://127.0.0.1:6006/v1/traces"
tracer_provider = trace_sdk.TracerProvider()
tracer_provider.add_span_processor(SimpleSpanProcessor(OTLPSpanExporter(endpoint)))
# Optionally, you can also print the spans to the console.
tracer_provider.add_span_processor(SimpleSpanProcessor(ConsoleSpanExporter()))

OpenAIInstrumentor().instrument(tracer_provider=tracer_provider)


if __name__ == "__main__":
    client = openai.OpenAI()
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": "Write a haiku."}],
        max_tokens=20,
        stream=True,
        stream_options={"include_usage": True},
    )
    for chunk in response:
        if chunk.choices and (content := chunk.choices[0].delta.content):
            print(content, end="")
```
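The `SimpleSpanProcessor` above exports each span synchronously as it ends, which is convenient for a demo. For longer-running applications, OpenTelemetry's `BatchSpanProcessor` is the usual choice, since it queues finished spans and exports them on a background thread. A minimal variation on the setup above:

```python
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk import trace as trace_sdk
from opentelemetry.sdk.trace.export import BatchSpanProcessor

# BatchSpanProcessor buffers finished spans and exports them in batches,
# avoiding a synchronous export on every span end.
tracer_provider = trace_sdk.TracerProvider()
tracer_provider.add_span_processor(
    BatchSpanProcessor(OTLPSpanExporter("http://127.0.0.1:6006/v1/traces"))
)
```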
Since we are using OpenAI, we must set the `OPENAI_API_KEY` environment variable to authenticate with the OpenAI API.
```shell
export OPENAI_API_KEY=your-api-key
```
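The environment variable is only the default lookup; the key can also be passed explicitly when constructing the client. A sketch, reading the key from wherever you keep secrets:

```python
import os

import openai

# openai.OpenAI() reads OPENAI_API_KEY from the environment by default;
# the api_key argument overrides that lookup explicitly.
client = openai.OpenAI(api_key=os.environ["OPENAI_API_KEY"])
```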
Now run the Python file and observe the traces in the Phoenix UI (by default at http://localhost:6006).
```shell
python your_file.py
```
## FAQ
**Q: How do I get token counts when streaming?**
**A:** To get token counts when streaming, install `openai>=1.26` and set `stream_options={"include_usage": True}` when calling `create`. See the example shown above. For more info, see [here](https://community.openai.com/t/usage-stats-now-available-when-using-streaming-with-the-chat-completions-api-or-completions-api/738156).
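Concretely, with `stream_options={"include_usage": True}` the final streamed chunk arrives with an empty `choices` list and a populated `usage` field. A minimal sketch, assuming a `client` constructed as in the quickstart above:

```python
# Assumes `client` is an openai.OpenAI() instance as in the quickstart.
stream = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Write a haiku."}],
    stream=True,
    stream_options={"include_usage": True},
)
usage = None
for chunk in stream:
    if chunk.choices and (content := chunk.choices[0].delta.content):
        print(content, end="")
    if chunk.usage is not None:  # populated only on the final chunk
        usage = chunk.usage
if usage is not None:
    print(f"\ntokens: prompt={usage.prompt_tokens}, completion={usage.completion_tokens}")
```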
## More Info
* [More info on OpenInference and Phoenix](https://docs.arize.com/phoenix)
* [How to customize spans to track sessions, metadata, etc.](https://github.com/Arize-ai/openinference/tree/main/python/openinference-instrumentation#customizing-spans)
* [How to account for private information and span payload customization](https://github.com/Arize-ai/openinference/tree/main/python/openinference-instrumentation#tracing-configuration)
Raw data

```json
{
    "_id": null,
    "home_page": null,
    "name": "openinference-instrumentation-openai",
    "maintainer": null,
    "docs_url": null,
    "requires_python": "<3.14,>=3.9",
    "maintainer_email": null,
    "keywords": null,
    "author": null,
    "author_email": "OpenInference Authors <oss@arize.com>",
    "download_url": "https://files.pythonhosted.org/packages/76/13/56b4c978a1e29eb30812c8d432277ea0e90e58cc6f183dbcc7b780a1f509/openinference_instrumentation_openai-0.1.19.tar.gz",
    "platform": null,
    "description": "# OpenInference OpenAI Instrumentation\n\n[](https://pypi.org/project/openinference-instrumentation-openai/)\n\nPython auto-instrumentation library for OpenAI's python SDK.\n\nThe traces emitted by this instrumentation are fully OpenTelemetry compatible and can be sent to an OpenTelemetry collector for viewing, such as [`arize-phoenix`](https://github.com/Arize-ai/phoenix)\n\n## Installation\n\n```shell\npip install openinference-instrumentation-openai\n```\n\n## Quickstart\n\nIn this example we will instrument a small program that uses OpenAI and observe the traces via [`arize-phoenix`](https://github.com/Arize-ai/phoenix).\n\nInstall packages.\n\n```shell\npip install openinference-instrumentation-openai \"openai>=1.26\" arize-phoenix opentelemetry-sdk opentelemetry-exporter-otlp\n```\n\nStart the phoenix server so that it is ready to collect traces.\nThe Phoenix server runs entirely on your machine and does not send data over the internet.\n\n```shell\npython -m phoenix.server.main serve\n```\n\nIn a python file, setup the `OpenAIInstrumentor` and configure the tracer to send traces to Phoenix.\n\n```python\nimport openai\nfrom openinference.instrumentation.openai import OpenAIInstrumentor\nfrom opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter\nfrom opentelemetry.sdk import trace as trace_sdk\nfrom opentelemetry.sdk.trace.export import ConsoleSpanExporter, SimpleSpanProcessor\n\nendpoint = \"http://127.0.0.1:6006/v1/traces\"\ntracer_provider = trace_sdk.TracerProvider()\ntracer_provider.add_span_processor(SimpleSpanProcessor(OTLPSpanExporter(endpoint)))\n# Optionally, you can also print the spans to the console.\ntracer_provider.add_span_processor(SimpleSpanProcessor(ConsoleSpanExporter()))\n\nOpenAIInstrumentor().instrument(tracer_provider=tracer_provider)\n\n\nif __name__ == \"__main__\":\n client = openai.OpenAI()\n response = client.chat.completions.create(\n model=\"gpt-3.5-turbo\",\n messages=[{\"role\": \"user\", \"content\": \"Write a haiku.\"}],\n max_tokens=20,\n stream=True,\n stream_options={\"include_usage\": True},\n )\n for chunk in response:\n if chunk.choices and (content := chunk.choices[0].delta.content):\n print(content, end=\"\")\n```\n\nSince we are using OpenAI, we must set the `OPENAI_API_KEY` environment variable to authenticate with the OpenAI API.\n\n```shell\nexport OPENAI_API_KEY=your-api-key\n```\n\nNow simply run the python file and observe the traces in Phoenix.\n\n```shell\npython your_file.py\n```\n\n## FAQ\n**Q: How to get token counts when streaming?**\n\n**A:** To get token counts when streaming, install `openai>=1.26` and set `stream_options={\"include_usage\": True}` when calling `create`. See the example shown above. For more info, see [here](https://community.openai.com/t/usage-stats-now-available-when-using-streaming-with-the-chat-completions-api-or-completions-api/738156).\n\n## More Info\n\n* [More info on OpenInference and Phoenix](https://docs.arize.com/phoenix)\n* [How to customize spans to track sessions, metadata, etc.](https://github.com/Arize-ai/openinference/tree/main/python/openinference-instrumentation#customizing-spans)\n* [How to account for private information and span payload customization](https://github.com/Arize-ai/openinference/tree/main/python/openinference-instrumentation#tracing-configuration)\n",
    "bugtrack_url": null,
    "license": null,
    "summary": "OpenInference OpenAI Instrumentation",
    "version": "0.1.19",
    "project_urls": {
        "Homepage": "https://github.com/Arize-ai/openinference/tree/main/python/instrumentation/openinference-instrumentation-openai"
    },
    "split_keywords": [],
    "urls": [
        {
            "comment_text": null,
            "digests": {
                "blake2b_256": "fa1cc9d5821b018416bfecf0d8d2b7e4e727e2bcea854d1db96ef95d351604e2",
                "md5": "c6734ee38bf65c5c4114993a6738ed2a",
                "sha256": "625f00e77b0ae568477b203d0a697b7d5c9dae3d4666ad65366c7ac184f45b71"
            },
            "downloads": -1,
            "filename": "openinference_instrumentation_openai-0.1.19-py3-none-any.whl",
            "has_sig": false,
            "md5_digest": "c6734ee38bf65c5c4114993a6738ed2a",
            "packagetype": "bdist_wheel",
            "python_version": "py3",
            "requires_python": "<3.14,>=3.9",
            "size": 23967,
            "upload_time": "2025-02-04T23:10:53",
            "upload_time_iso_8601": "2025-02-04T23:10:53.460600Z",
            "url": "https://files.pythonhosted.org/packages/fa/1c/c9d5821b018416bfecf0d8d2b7e4e727e2bcea854d1db96ef95d351604e2/openinference_instrumentation_openai-0.1.19-py3-none-any.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": null,
            "digests": {
                "blake2b_256": "761356b4c978a1e29eb30812c8d432277ea0e90e58cc6f183dbcc7b780a1f509",
                "md5": "b10c0d67501c90c88e99e7e9414ba24e",
                "sha256": "ac5ed4698fe1e40f9f3042111490288b49d9df73554400317d0aca8953cb4fa1"
            },
            "downloads": -1,
            "filename": "openinference_instrumentation_openai-0.1.19.tar.gz",
            "has_sig": false,
            "md5_digest": "b10c0d67501c90c88e99e7e9414ba24e",
            "packagetype": "sdist",
            "python_version": "source",
            "requires_python": "<3.14,>=3.9",
            "size": 17949,
            "upload_time": "2025-02-04T23:11:09",
            "upload_time_iso_8601": "2025-02-04T23:11:09.141647Z",
            "url": "https://files.pythonhosted.org/packages/76/13/56b4c978a1e29eb30812c8d432277ea0e90e58cc6f183dbcc7b780a1f509/openinference_instrumentation_openai-0.1.19.tar.gz",
            "yanked": false,
            "yanked_reason": null
        }
    ],
    "upload_time": "2025-02-04 23:11:09",
    "github": true,
    "gitlab": false,
    "bitbucket": false,
    "codeberg": false,
    "github_user": "Arize-ai",
    "github_project": "openinference",
    "travis_ci": false,
    "coveralls": false,
    "github_actions": true,
    "lcname": "openinference-instrumentation-openai"
}
```