| Field | Value |
| --- | --- |
| Name | openinference-instrumentation-groq |
| Version | 0.1.6 |
| Summary | OpenInference Groq Instrumentation |
| Upload time | 2025-02-04 23:10:59 |
| Requires Python | <3.14, >=3.9 |
| Home page | None |
| Docs URL | None |
| Maintainer | None |
| Author | None |
| License | None |
# OpenInference Groq Instrumentation
A Python auto-instrumentation library for the [Groq](https://wow.groq.com/why-groq/) package.
This package implements OpenInference tracing for both Groq and AsyncGroq clients.
These traces are fully OpenTelemetry-compatible and can be sent to an OpenTelemetry collector, such as [Arize `phoenix`](https://github.com/Arize-ai/phoenix), for viewing.
## Installation
```shell
pip install openinference-instrumentation-groq
```
## Quickstart
In your *terminal*, install the required packages.
```shell
pip install openinference-instrumentation-groq groq arize-phoenix opentelemetry-sdk opentelemetry-exporter-otlp
```
You can start Phoenix with the following terminal command:
```shell
python -m phoenix.server.main serve
```
By default, Phoenix listens on `http://localhost:6006`. You can visit the app via a browser at the same address. (Phoenix does not send data over the internet. It only operates locally on your machine.)
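If you would rather launch Phoenix from Python instead of the terminal, the `arize-phoenix` package installed above also provides a launcher. A minimal sketch; the `launch_app` call and the session's `url` attribute are assumptions based on the Phoenix documentation:
```python
import phoenix as px

# Start the Phoenix app inside the current Python process;
# by default it serves on http://localhost:6006.
session = px.launch_app()
print(session.url)
```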
Try the following code in a *Python file*.
1. Set up `GroqInstrumentor` to trace your application and send the traces to Phoenix.
2. Then, set your Groq API key as an environment variable.
3. Lastly, create a Groq client, make a request, then go see your results in Phoenix at `http://localhost:6006`!
```python
import os

from groq import Groq
from openinference.instrumentation.groq import GroqInstrumentor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk import trace as trace_sdk
from opentelemetry.sdk.trace.export import SimpleSpanProcessor

# Point the OTLP exporter at the local Phoenix collector and register it
# on a tracer provider.
endpoint = "http://127.0.0.1:6006/v1/traces"
tracer_provider = trace_sdk.TracerProvider()
tracer_provider.add_span_processor(SimpleSpanProcessor(OTLPSpanExporter(endpoint)))

# Instrument the Groq client library so every request emits OpenInference spans.
GroqInstrumentor().instrument(tracer_provider=tracer_provider)

os.environ["GROQ_API_KEY"] = "YOUR_KEY_HERE"

client = Groq()

chat_completion = client.chat.completions.create(
    messages=[
        {
            "role": "user",
            "content": "Explain the importance of low latency LLMs",
        }
    ],
    model="llama3-8b-8192",
)

if __name__ == "__main__":
    print(chat_completion.choices[0].message.content)
```
Now, in the Phoenix UI in your browser, you should see the traces from your Groq application. Click on a trace, and the "Attributes" tab will provide in-depth information about the execution!
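Because the instrumentor also covers the `AsyncGroq` client, asynchronous calls are traced with the same setup. A minimal sketch, assuming the tracer provider and instrumentation from the snippet above are already in place:
```python
import asyncio

from groq import AsyncGroq


async def main() -> None:
    client = AsyncGroq()  # reads GROQ_API_KEY from the environment
    chat_completion = await client.chat.completions.create(
        messages=[{"role": "user", "content": "Explain the importance of low latency LLMs"}],
        model="llama3-8b-8192",
    )
    print(chat_completion.choices[0].message.content)


asyncio.run(main())
```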
## More Info
* [More info on OpenInference and Phoenix](https://docs.arize.com/phoenix)
* [How to customize spans to track sessions, metadata, etc.](https://github.com/Arize-ai/openinference/tree/main/python/openinference-instrumentation#customizing-spans)
* [How to account for private information and span payload customization](https://github.com/Arize-ai/openinference/tree/main/python/openinference-instrumentation#tracing-configuration)
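
As a taste of the span customization described in the links above, the shared `openinference-instrumentation` package provides a `using_attributes` context manager for attaching session, user, and metadata attributes to spans emitted inside it. A rough sketch; the parameter names follow that package's README and should be treated as assumptions if your installed version differs:
```python
from openinference.instrumentation import using_attributes

# Any Groq calls made inside this block get the extra attributes on their spans.
with using_attributes(
    session_id="my-session-id",  # groups related traces in Phoenix
    user_id="my-user-id",
    metadata={"environment": "staging"},
    tags=["groq", "quickstart"],
):
    chat_completion = client.chat.completions.create(
        messages=[{"role": "user", "content": "Explain the importance of low latency LLMs"}],
        model="llama3-8b-8192",
    )
```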