# Chat Component
A custom chat component for Reflex.
## Installation
```bash
pip install reflex-chat
```
## Usage
See the `chat_demo` folder for an example app.
```python
import reflex as rx
from reflex_chat import chat, api

@rx.page()
def index() -> rx.Component:
    return rx.container(
        rx.box(
            chat(process=api.openai()),
            height="100vh",
        ),
        size="2",
    )

app = rx.App()
```
1. Import the `chat` component into your code.
```python
from reflex_chat import chat
```
2. Specify the `process` function that will be called every time the user submits a question in the chat box. The `process` function should be an async function that yields after appending parts of the streamed response.
We have a default `process` function that uses the OpenAI API to get the response. You can use it by importing the `api` module. Over time we will add more `process` functions into the library.
To use the OpenAI API, you need to set the `OPENAI_API_KEY` environment variable. You can specify the model with the `OPENAI_MODEL` environment variable or pass it as an argument to the `api.openai()` function.
```python
chat(process=api.openai(model="gpt-3.5-turbo")),
```
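For example, in a POSIX shell, you might configure the OpenAI process before running the app (the key value here is a placeholder):

```bash
# Required for api.openai() to authenticate with OpenAI.
export OPENAI_API_KEY="sk-..."
# Optional; can also be passed as api.openai(model=...).
export OPENAI_MODEL="gpt-3.5-turbo"
```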
See below on how to specify your own `process` function.
3. Add the `chat` component to your page.
By default, the component takes up the full width and height of its parent container. You can set the component's size by passing `width` and `height` arguments to the `chat` component.
```python
@rx.page()
def index() -> rx.Component:
    return rx.container(
        rx.box(
            chat(process=api.openai(model="gpt-3.5-turbo")),
            height="100vh",
        ),
        size="2",
    )
```
## Accessing the Chat State
Once you create a chat component, you can access its state through the `chat.State` object.
Get the messages from the chat state.
```python
chat1 = chat()

@rx.page()
def index() -> rx.Component:
    return rx.container(
        # Get the messages through chat1.State.messages.
        rx.text("Total Messages: ", chat1.State.messages.length()),
        # Get the last user message through chat1.State.last_user_message.
        rx.text(chat1.State.last_user_message),
        rx.hstack(
            chat1,
            height="100vh",
        ),
        # Get the processing state through chat1.State.processing.
        background_color=rx.cond(chat1.State.processing, "gray", "white"),
        size="4",
    )
```
## Specifying your own process function
You can specify your own `process` function that will be called every time the user submits a question in the chat box. The `process` function should be an async function that takes in the current chat state and yields after appending parts of the streamed response.
The OpenAI `process` function is defined as follows:
```python
async def process(chat: Chat):
    # Start a new session to answer the question.
    session = client.chat.completions.create(
        model=model,
        # Use chat.get_messages() to get the messages when processing.
        messages=chat.get_messages(),
        stream=True,
    )

    # Stream the results, yielding after every chunk.
    for item in session:
        delta = item.choices[0].delta.content
        # Append to the last bot message (which defaults to an empty string).
        chat.append_to_response(delta)
        yield

# The enclosing api.openai() factory returns this function.
return process
```
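As a concrete illustration, here is a minimal custom `process` function, a toy echo bot. It relies only on the two `Chat` methods shown above, `get_messages()` and `append_to_response()`; the assumption that messages are OpenAI-style dicts with a `"content"` key is inferred from the snippet above, not confirmed by the library.

```python
async def echo_process(chat):
    # Echo the user's last message back, word by word.
    # Assumes get_messages() returns OpenAI-style dicts with a "content" key.
    last_message = chat.get_messages()[-1]["content"]
    for word in last_message.split():
        # Append to the bot's response and yield so the UI can update.
        chat.append_to_response(word + " ")
        yield
```

You would then pass it to the component as `chat(process=echo_process)`.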