bike4py 0.1.1 (PyPI)

- Summary: Bike4Mind API client
- Uploaded: 2025-02-18 02:59:08
- Requires Python: >=3.8
- Keywords: api, bike4mind, bike4py, chat, client, llm
- Homepage: https://bike4mind.com
- Repository: https://github.com/MillionOnMars/bike4py
# bike4py

Bike4py is a Python client for the Bike4Mind API.

## Installation

```bash
pip install bike4py
```

## Usage

To use the library, first create a client with your refresh token:

```python
from bike4py import LLMClient

client = LLMClient(
    refresh_token="your_refresh_token"
)
```

Once you've created a client, connect to the WebSocket; this is where
responses to your prompts are delivered:

```python
# Connect to websocket, needed to see LLM responses
await client.connect()
```

You can then submit a prompt and stream the response. The notebook ID appears in the URL of the notebook you want to use.

```python
# ChatCompletionRequest and the event classes are assumed to be
# importable from the top-level package, alongside LLMClient.
from bike4py import (
    ChatCompletionRequest,
    StatusEvent,
    ContentEvent,
    CompletionEvent,
)

# Submit a prompt
request = ChatCompletionRequest(
  sessionId="%your_notebook_id%",
  message="Hello, how are you?"
)
response = client.submit_prompt(request)

# Stream the response
async for event in client.stream_events():
    if isinstance(event, StatusEvent):
        print(event.status)
    elif isinstance(event, ContentEvent):
        print(event.content)
    elif isinstance(event, CompletionEvent):
        print(event.success)
```

The event classes emitted by the `stream_events` method are:

- `StatusEvent`: Status updates from the LLM as Bike4Mind is processing the request.
- `ContentEvent`: While generating, the LLM will stream content in chunks; this event includes the accumulated progress of the response.
- `CompletionEvent`: At completion, this event is emitted with the final response.

The intent is that it is easy to separate the event types and process only the ones you need. If you want streaming content for UI purposes, handle `ContentEvent` events; if you want to know when the LLM has finished, handle the `CompletionEvent`.
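For instance, a small helper can fold the stream down to just the final text and success flag. The sketch below is self-contained: it uses stand-in dataclasses with the field names described above (`status`, `content`, `success`) and a fake stream, since the real classes and `stream_events()` ship with bike4py.

```python
import asyncio
from dataclasses import dataclass

# Stand-in event types mirroring the shapes described above;
# the real classes are provided by bike4py.
@dataclass
class StatusEvent:
    status: str

@dataclass
class ContentEvent:
    content: str  # accumulated response so far

@dataclass
class CompletionEvent:
    success: bool

async def fake_stream():
    """Simulates stream_events(): a status update, growing content, then completion."""
    for event in (
        StatusEvent("processing"),
        ContentEvent("Hello"),
        ContentEvent("Hello, world"),
        CompletionEvent(True),
    ):
        yield event

async def final_text(stream):
    """Keep only the events we care about: the latest content and the completion flag."""
    text, ok = "", False
    async for event in stream:
        if isinstance(event, ContentEvent):
            text = event.content  # content is accumulated, so the latest wins
        elif isinstance(event, CompletionEvent):
            ok = event.success
    return text, ok

text, ok = asyncio.run(final_text(fake_stream()))
print(text, ok)  # → Hello, world True
```

With the real client, `fake_stream()` would be replaced by `client.stream_events()`.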

## Upload a file

```python
file_id = client.upload_file("test.txt", "text/plain")
```

This returns a `file_id`, which you can include in a `ChatCompletionRequest` via `fabFileIds` to reference the file in a prompt:

```python
...
file_id = client.upload_file("test.txt", "text/plain")
request = ChatCompletionRequest(
  sessionId="%your_notebook_id%",
  message="Can you summarize this file for me?",
  fabFileIds=[file_id]
)
response = client.submit_prompt(request)
...
```
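An aside on `upload_file`: the second argument appears to be the file's MIME type (an inference from the `"text/plain"` example above, not documented behavior). When the type isn't known ahead of time, Python's standard `mimetypes` module can guess it from the filename:

```python
import mimetypes

def guess_mime(path: str) -> str:
    """Guess a MIME type from the filename, with a generic fallback."""
    mime, _encoding = mimetypes.guess_type(path)
    return mime or "application/octet-stream"

print(guess_mime("test.txt"))    # → text/plain
print(guess_mime("report.pdf"))  # → application/pdf
```

The result could then be passed along as `client.upload_file(path, guess_mime(path))`.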
            
