grafana-openai-monitoring

Name: grafana-openai-monitoring
Version: 0.0.9
Home page: https://github.com/grafana/grafana-openai-monitoring
Summary: Library to monitor your OpenAI usage and send metrics and logs to Grafana Cloud
Upload time: 2024-04-25 07:22:00
Author: Ishan Jain
Requires Python: <4.0.0,>=3.7.1
Keywords: observability, monitoring, openai, grafana, gpt
# OpenAI Monitoring: Monitor OpenAI API Usage with Grafana Cloud
[![Grafana](https://img.shields.io/badge/grafana-%23F46800.svg?&logo=grafana&logoColor=white)](https://grafana.com)
[![GitHub Last Commit](https://img.shields.io/github/last-commit/grafana/grafana-openai-monitoring)](https://github.com/grafana/grafana-openai-monitoring/tags)
[![GitHub Contributors](https://img.shields.io/github/contributors/grafana/grafana-openai-monitoring)](https://github.com/grafana/grafana-openai-monitoring/tags)

[![Python Tests](https://github.com/grafana/grafana-openai-monitoring/actions/workflows/python-tests.yml/badge.svg?branch=main)](https://github.com/grafana/grafana-openai-monitoring/actions/workflows/python-tests.yml)
[![Pylint](https://github.com/grafana/grafana-openai-monitoring/actions/workflows/pylint.yml/badge.svg?branch=main)](https://github.com/grafana/grafana-openai-monitoring/actions/workflows/pylint.yml)

[grafana-openai-monitoring](https://pypi.org/project/grafana-openai-monitoring/) is a Python library that provides decorators to monitor the Chat Completions and Completions endpoints of the OpenAI API. It sends metrics and logs to **Grafana Cloud**, allowing you to track and analyze OpenAI API usage and responses.

## Installation
You can install [grafana-openai-monitoring](https://pypi.org/project/grafana-openai-monitoring/) using pip:

```bash
pip install grafana-openai-monitoring
```

## Usage

The following table shows which OpenAI function corresponds to which monitoring function in this library:

| OpenAI Function         | Monitoring Function |
|-------------------------|---------------------|
| `ChatCompletion.create` | `chat_v2.monitor`   |
| `Completion.create`     | `chat_v1.monitor`   |

### ChatCompletions

To monitor ChatCompletions using the OpenAI API, you can use the `chat_v2.monitor` decorator. This decorator automatically tracks API calls and sends metrics and logs to the specified Grafana Cloud endpoints.

Here's how to set it up:

```python
from openai import OpenAI
from grafana_openai_monitoring import chat_v2

client = OpenAI(
    api_key="YOUR_OPENAI_API_KEY",
)

# Apply the custom decorator to the OpenAI API function.
# To use it with AsyncOpenAI, pass use_async=True to this function.
client.chat.completions.create = chat_v2.monitor(
    client.chat.completions.create,
    metrics_url="YOUR_PROMETHEUS_METRICS_URL",  # Example: "https://prometheus.grafana.net/api/prom"
    logs_url="YOUR_LOKI_LOGS_URL",  # Example: "https://logs.example.com/loki/api/v1/push/"
    metrics_username="YOUR_METRICS_USERNAME",  # Example: "123456"
    logs_username="YOUR_LOGS_USERNAME",  # Example: "987654"
    access_token="YOUR_ACCESS_TOKEN"  # Example: "glc_eyasdansdjnaxxxxxxxxxxx"
)

# Now any call to client.chat.completions.create will be automatically tracked
response = client.chat.completions.create(model="gpt-4", max_tokens=100, messages=[{"role": "user", "content": "What is Grafana?"}])
print(response)
```
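
The comment in the example above mentions `use_async` for asynchronous clients. Below is a minimal sketch of what that might look like with `AsyncOpenAI`; the `use_async=True` keyword comes from that comment, while the rest of the call shape simply mirrors the synchronous setup and should be treated as an assumption rather than confirmed behaviour.

```python
import asyncio

from openai import AsyncOpenAI
from grafana_openai_monitoring import chat_v2

async_client = AsyncOpenAI(
    api_key="YOUR_OPENAI_API_KEY",
)

# Wrap the async method; use_async=True (from the comment above) asks the
# decorator to return an awaitable wrapper instead of a synchronous one.
async_client.chat.completions.create = chat_v2.monitor(
    async_client.chat.completions.create,
    metrics_url="YOUR_PROMETHEUS_METRICS_URL",
    logs_url="YOUR_LOKI_LOGS_URL",
    metrics_username="YOUR_METRICS_USERNAME",
    logs_username="YOUR_LOGS_USERNAME",
    access_token="YOUR_ACCESS_TOKEN",
    use_async=True,
)

async def main():
    # Awaited as usual; the decorator handles shipping metrics and logs.
    response = await async_client.chat.completions.create(
        model="gpt-4",
        max_tokens=100,
        messages=[{"role": "user", "content": "What is Grafana?"}],
    )
    print(response)

asyncio.run(main())
```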

### Completions

To monitor completions using the OpenAI API, you can use the `chat_v1.monitor` decorator. This decorator adds monitoring capabilities to the OpenAI API function and sends metrics and logs to the specified Grafana Cloud endpoints.

Here's how to apply it:

```python
from openai import OpenAI
from grafana_openai_monitoring import chat_v1

client = OpenAI(
    api_key="YOUR_OPENAI_API_KEY",
)

# Apply the custom decorator to the OpenAI API function
client.completions.create = chat_v1.monitor(
    client.completions.create,
    metrics_url="YOUR_PROMETHEUS_METRICS_URL",  # Example: "https://prometheus.grafana.net/api/prom"
    logs_url="YOUR_LOKI_LOGS_URL",  # Example: "https://logs.example.com/loki/api/v1/push/"
    metrics_username="YOUR_METRICS_USERNAME",  # Example: "123456"
    logs_username="YOUR_LOGS_USERNAME",  # Example: "987654"
    access_token="YOUR_ACCESS_TOKEN"  # Example: "glc_eyasdansdjnaxxxxxxxxxxx"
)

# Now any call to client.completions.create will be automatically tracked
response = client.completions.create(model="davinci", max_tokens=100, prompt="Isn't Grafana the best?")
print(response)
```

## Configuration
To use the grafana-openai-monitoring library effectively, you need to provide the following information:

- **YOUR_OPENAI_API_KEY**: Replace this with your actual OpenAI API key.
- **YOUR_PROMETHEUS_METRICS_URL**: Replace with your Prometheus metrics URL.
- **YOUR_LOKI_LOGS_URL**: Replace with the URL where you want to send Loki logs.
- **YOUR_METRICS_USERNAME**: Replace with the username for Prometheus.
- **YOUR_LOGS_USERNAME**: Replace with the username for Loki.
- **YOUR_ACCESS_TOKEN**: Replace with the [Cloud Access Policy token](https://grafana.com/docs/grafana-cloud/account-management/authentication-and-permissions/access-policies/) required for authentication.

After configuring the parameters, the monitored API function will automatically log and track the requests and responses to the specified endpoints.
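
One way to keep these values out of source code is to read them from environment variables before wrapping the client. The variable names in the sketch below are purely illustrative, not part of the library's documented configuration.

```python
import os

from openai import OpenAI
from grafana_openai_monitoring import chat_v2

# Hypothetical environment variable names; use whatever fits your deployment.
client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])

client.chat.completions.create = chat_v2.monitor(
    client.chat.completions.create,
    metrics_url=os.environ["GRAFANA_METRICS_URL"],
    logs_url=os.environ["GRAFANA_LOGS_URL"],
    metrics_username=os.environ["GRAFANA_METRICS_USERNAME"],
    logs_username=os.environ["GRAFANA_LOGS_USERNAME"],
    access_token=os.environ["GRAFANA_ACCESS_TOKEN"],
)
```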

## Compatibility
Python 3.7.1 and above

## Dependencies
- [OpenAI](https://pypi.org/project/openai/)
- [requests](https://pypi.org/project/requests/)

## License
This project is licensed under the GPL-3.0 license - see the [LICENSE](LICENSE.txt) file for details.


            
