openpipe


Name: openpipe
Version: 4.11.0
Home page: https://github.com/OpenPipe/OpenPipe
Summary: Python client library for the OpenPipe service
Upload time: 2024-05-03 01:58:12
Author: Kyle Corbitt
Requires Python: <4.0,>=3.8
License: Apache-2.0

# OpenPipe Python Client

This client allows you to automatically report your OpenAI calls to [OpenPipe](https://openpipe.ai/).

## Installation

`pip install openpipe`

## Usage

1. Create a project at https://app.openpipe.ai
2. Find your project's API key at https://app.openpipe.ai/settings
3. Configure the OpenPipe client as shown below.

```python
from openpipe import OpenAI
import os

client = OpenAI(
    # defaults to os.environ.get("OPENAI_API_KEY")
    api_key="My API Key",
    openpipe={
        # Set the OpenPipe API key you got in step (2) above.
        # If you have the `OPENPIPE_API_KEY` environment variable set we'll read from it by default
        "api_key": "My OpenPipe API Key",
    }
)
```

You can now use your new client, which functions identically to the standard OpenAI client while also reporting calls to your OpenPipe instance.
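
For example (a minimal sketch; the model name and prompt below are placeholders), a plain chat completion call looks the same as it would with the stock client:

```python
# The client behaves like the standard OpenAI client; this call is also
# reported to OpenPipe automatically.
completion = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Say hello"}],
)
print(completion.choices[0].message.content)
```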

## Special Features

### Tagging

OpenPipe has a concept of "tagging," which is useful for grouping related completions together. When you're assembling a dataset for fine-tuning, you can select all the prompts that match a certain set of tags. Here's how to use the tagging feature:

```python
completion = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "system", "content": "count to 10"}],
    openpipe={
        "tags": {"prompt_id": "counting"},
        "log_request": True, # Enable/disable data collection. Defaults to True.
    },
)
```

#### Should I Wait to Enable Logging?

We recommend keeping request logging turned on from the beginning. If you change your prompt, you can set a new `prompt_id` tag so that you can select only the latest version when you're ready to create a dataset.
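
For example (a minimal sketch; the tag values here are arbitrary), you might bump the `prompt_id` value whenever the prompt changes:

```python
# Hypothetical example: encode the prompt version in the tag value so you can
# later filter the dataset down to only the latest prompt version.
completion = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "system", "content": "count to 10 in French"}],
    openpipe={"tags": {"prompt_id": "counting_v2"}},
)
```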

## Usage with LangChain

> This assumes you have already created a project and have your OpenPipe API key.

```python
from openpipe.langchain_llm import ChatOpenAI
from langchain.prompts import ChatPromptTemplate
from langchain.schema.runnable import RunnableSequence

prompt = ChatPromptTemplate.from_messages(
    [
        (
            "system",
            "Classify user query into positive, negative or neutral.",
        ),
        ("human", "{query}"),
    ]
)
llm = ChatOpenAI(model="gpt-3.5-turbo")\
    .with_tags(chain_name="classify", any_key="some")

# To provide the openpipe key explicitly
# llm = ChatOpenAI(model="gpt-3.5-turbo", openpipe_kwargs={"api_key": "My OpenPipe API Key"})\
#     .with_tags(chain_name="classify", any_key="some")

chain: RunnableSequence = prompt | llm
res = chain.invoke(
    {"query": "this is good"}
)
```
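
As with the OpenAI client above, calls made through this chain should be reported to OpenPipe, with the tags passed to `with_tags` attached to each logged request.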

            
