comet-llm

Name: comet-llm
Version: 2.2.6
Home page: https://www.comet.com
Summary: Comet logger for LLM
Upload time: 2024-07-17 10:23:16
Author: Comet ML Inc.
Requires Python: >=3.6
License: MIT
Keywords: comet_llm
            <p align="center">
    <img alt="cometLLM" src="https://github.com/comet-ml/comet-llm/raw/main/logo.svg">
</p>
<p align="center">
    <a href="https://pypi.org/project/comet-llm">
        <img src="https://img.shields.io/pypi/v/comet-llm" alt="PyPI version">
    </a>
    <a rel="nofollow" href="https://opensource.org/license/mit/">
        <img alt="GitHub" src="https://img.shields.io/badge/License-MIT-blue.svg">
    </a>
    <a href="https://www.comet.com/docs/v2/guides/large-language-models/overview/" rel="nofollow">
        <img src="https://img.shields.io/badge/cometLLM-Docs-blue.svg" alt="cometLLM Documentation">
    </a>
    <a rel="nofollow" href="https://pepy.tech/project/comet-llm">
        <img style="max-width: 100%;" src="https://static.pepy.tech/badge/comet-llm" alt="Downloads">
    </a>
    <a rel="nofollow" href="https://colab.research.google.com/github/comet-ml/comet-llm/blob/main/examples/CometLLM_Prompts.ipynb">
        <img src="https://colab.research.google.com/assets/colab-badge.svg">
    </a>
</p>
<p align="center">
    <b>CometLLM</b> is a tool to log and visualize your LLM prompts and chains. Use CometLLM to identify effective prompt strategies, streamline your troubleshooting, and ensure reproducible workflows!
</p>

![CometLLM Preview](https://github.com/comet-ml/comet-llm/raw/main/comet_llm.gif)

## ⚡️ Quickstart

Install the `comet_llm` Python library with pip:

```bash
pip install comet_llm
```

If you don't already have one, [create your free Comet account](https://www.comet.com/signup/?utm_source=comet_llm&utm_medium=referral&utm_content=github&framework=llm) and grab your API key from the account settings page.

Now you are all set to log your first prompt and response:

```python
import comet_llm

comet_llm.log_prompt(
    prompt="What is your name?",
    output=" My name is Alex.",
    api_key="<YOUR_COMET_API_KEY>",
)
```
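
The Configuration section below lists `workspace` and `project` as Python parameters alongside `api_key`; a minimal sketch, assuming `log_prompt` accepts them as keyword arguments to route the prompt to a specific workspace and project:

```python
import comet_llm

comet_llm.log_prompt(
    prompt="What is your name?",
    output=" My name is Alex.",
    api_key="<YOUR_COMET_API_KEY>",
    workspace="<your-workspace>",  # optional (see Configuration below)
    project="<your-project>",      # optional (see Configuration below)
)
```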

## 🎯 Features

- [x] Log your prompts and responses, including the prompt template, variables, timestamps, duration, and any metadata you need.
- [x] Visualize your prompts and responses in the UI.
- [x] Log your chain execution down to the level of granularity that you need.
- [x] Visualize your chain execution in the UI.
- [x] Automatically track your prompts when using OpenAI chat models.
- [x] Track and analyze user feedback.
- [ ] Diff your prompts and chain execution in the UI.

## 👀 Examples

To log a single LLM call as an individual prompt, use `comet_llm.log_prompt`. If you need more granularity, use `comet_llm.start_chain` to log a chain of executions that can include multiple LLM calls, context retrieval, and data pre- or post-processing.

### Log a full prompt and response

```python
import comet_llm

comet_llm.log_prompt(
    prompt="Answer the question and if the question can't be answered, say \"I don't know\"\n\n---\n\nQuestion: What is your name?\nAnswer:",
    prompt_template="Answer the question and if the question can't be answered, say \"I don't know\"\n\n---\n\nQuestion: {{question}}?\nAnswer:",
    prompt_template_variables={"question": "What is your name?"},
    metadata={
        "usage.prompt_tokens": 7,
        "usage.completion_tokens": 5,
        "usage.total_tokens": 12,
    },
    output=" My name is Alex.",
    duration=16.598,
)
```

[Read the full documentation for more details about logging a prompt](https://www.comet.com/docs/v2/guides/large-language-models/llm-project/#logging-prompts-to-llm-projects).
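
As a practical variation, here is a sketch that wires `log_prompt` to a real OpenAI chat call; the `openai` client usage, the model name, and the token accounting are illustrative assumptions, not part of `comet_llm` itself:

```python
import time

import comet_llm
from openai import OpenAI  # assumes `openai` is installed and OPENAI_API_KEY is set

client = OpenAI()
question = "What is your name?"

start = time.time()
response = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative model name
    messages=[{"role": "user", "content": question}],
)

comet_llm.log_prompt(
    prompt=question,
    output=response.choices[0].message.content,
    metadata={
        "usage.prompt_tokens": response.usage.prompt_tokens,
        "usage.completion_tokens": response.usage.completion_tokens,
        "usage.total_tokens": response.usage.total_tokens,
    },
    duration=time.time() - start,
)
```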

### Log an LLM chain

```python
from comet_llm import Span, end_chain, start_chain
import datetime
from time import sleep


def retrieve_context(user_question):
    # Stubbed retrieval: only questions about opening hours return context.
    if "open" in user_question:
        return "Opening hours: 08:00 to 17:00 all days"


def llm_answering(user_question, current_time, context):
    prompt_template = """You are a helpful chatbot. You have access to the following context:
    {context}
    The current time is: {current_time}
    Analyze the following user question and decide if you can answer it; if the question can't be answered, say "I don't know":
    {user_question}
    """

    prompt = prompt_template.format(
        user_question=user_question, current_time=current_time, context=context
    )

    with Span(
        category="llm-call",
        inputs={"prompt_template": prompt_template, "prompt": prompt},
    ) as span:
        # Call your LLM here; a short sleep simulates request latency.
        sleep(0.1)
        result = "Yes we are currently open"
        usage = {"prompt_tokens": 52, "completion_tokens": 12, "total_tokens": 64}

        span.set_outputs(outputs={"result": result}, metadata={"usage": usage})

    return result


def main(user_question, current_time):
    start_chain(inputs={"user_question": user_question, "current_time": current_time})

    with Span(
        category="context-retrieval",
        name="Retrieve Context",
        inputs={"user_question": user_question},
    ) as span:
        context = retrieve_context(user_question)

        span.set_outputs(outputs={"context": context})

    with Span(
        category="llm-reasoning",
        inputs={
            "user_question": user_question,
            "current_time": current_time,
            "context": context,
        },
    ) as span:
        result = llm_answering(user_question, current_time, context)

        span.set_outputs(outputs={"result": result})

    end_chain(outputs={"result": result})


main("Are you open?", str(datetime.datetime.now().time()))
```

[Read the full documentation for more details about logging a chain](https://www.comet.com/docs/v2/guides/large-language-models/llm-project/#logging-chains-to-llm-projects).
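
Note that spans nest by context: the `llm-call` span opened inside `llm_answering` above is recorded under the `llm-reasoning` span that is active when the function runs. The same nesting can be written lexically; a minimal sketch with illustrative category names:

```python
from comet_llm import Span, end_chain, start_chain

start_chain(inputs={"user_question": "Are you open?"})

with Span(category="pipeline", inputs={"stage": "outer"}) as outer:
    # A span opened while another is active is nested under it.
    with Span(category="preprocessing", inputs={"stage": "inner"}) as inner:
        inner.set_outputs(outputs={"cleaned_question": "are you open"})
    outer.set_outputs(outputs={"done": True})

end_chain(outputs={"result": "Yes we are currently open"})
```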

## ⚙️ Configuration

You can configure your Comet credentials and where your data is logged, using either Python parameters or environment variables:

| Name                 | Python parameter name | Environment variable name |
| -------------------- | --------------------- | ------------------------- |
| Comet API key        | api_key               | COMET_API_KEY             |
| Comet Workspace name | workspace             | COMET_WORKSPACE           |
| Comet Project name   | project               | COMET_PROJECT_NAME        |
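
For example, a minimal sketch that keeps credentials out of the logging calls by relying on environment variables (placeholder values; exporting them in your shell before running works the same way):

```python
import os

# Set these before the first comet_llm call; `export` in your shell is equivalent.
os.environ["COMET_API_KEY"] = "<YOUR_COMET_API_KEY>"
os.environ["COMET_WORKSPACE"] = "<your-workspace>"
os.environ["COMET_PROJECT_NAME"] = "<your-project>"

import comet_llm

comet_llm.log_prompt(
    prompt="What is your name?",
    output=" My name is Alex.",
)
```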

## 📝 License

Copyright (c) [Comet](https://www.comet.com/site/) 2023-present. `cometLLM` is free and open-source software licensed under the [MIT License](https://github.com/comet-ml/comet-llm/blob/master/LICENSE).

            
