| Field | Value |
| --- | --- |
| Name | chatlab |
| Version | 1.1.1 |
| home_page | https://github.com/rgbkrk/chatlab |
| Summary | Chat Plugin Experiments, Simplified. Create agents and give them superpowers in your notebooks. |
| upload_time | 2023-11-21 02:32:21 |
| maintainer | |
| docs_url | None |
| author | Kyle Kelley |
| requires_python | >=3.9,<4.0 |
| license | BSD-3-Clause |
| keywords | |
| bugtrack_url | |
| requirements | No requirements were recorded. |
| Travis-CI | No Travis. |
| coveralls test coverage | No coveralls. |
# ChatLab
**Chat Experiments, Simplified**
💬🔬
ChatLab is a Python package that makes it easy to experiment with OpenAI's chat models. It provides a simple interface for chatting with the models and a way to register functions that can be called from the chat model.
Best yet, it's interactive in the notebook!
⚠️ NOTE: The following docs are for `main`. Check out [chatlab.dev](https://chatlab.dev) for the docs for the most recent release.
## Introduction
```python
import chatlab
import random

def flip_a_coin():
    '''Returns heads or tails'''
    return random.choice(['heads', 'tails'])

chat = chatlab.Chat()
chat.register(flip_a_coin)

await chat("Please flip a coin for me")
```
<details style="background:#DDE6ED;color:#27374D;padding:.5rem 1rem;border-radius:5px">
<summary> 𝑓 Ran `flip_a_coin`
</summary>
<br />
Input:
```json
{}
```
Output:
```json
"tails"
```
</details>
```markdown
It landed on tails!
```
In the notebook, text streams into a Markdown output, and function inputs and outputs get a nice collapsible display, like with ChatGPT Plugins.
TODO: Include GIF/mp4 of this in action
### Installation
```bash
pip install chatlab
```
### Configuration
You'll need to set your `OPENAI_API_KEY` environment variable. You can find your API key on your [OpenAI account page](https://platform.openai.com/account/api-keys). I recommend setting it in an `.env` file when working locally.
On hosted environments like Noteable, set it in your Secrets to keep it safe from prying LLM eyes.
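If you keep the key in a `.env` file, one way to load it before creating a `Chat` is shown below. This is a minimal sketch that assumes the optional `python-dotenv` package; any method that puts `OPENAI_API_KEY` into the environment works just as well.

```python
# Sketch: load OPENAI_API_KEY from a local .env file.
# Assumes python-dotenv is installed (`pip install python-dotenv`);
# ChatLab only needs the variable to be present in the environment.
import os

from dotenv import load_dotenv

load_dotenv()  # reads key=value pairs from .env into os.environ
assert "OPENAI_API_KEY" in os.environ, "OPENAI_API_KEY is not set"
```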
## What can `Chat`s enable _you_ to do?
💬
Where `Chat`s take it to the next level is with _Chat Functions_. You can
- declare a function
- register the function in your `Chat`
- watch as Chat Models call your functions!
You may recall this kind of behavior from [ChatGPT Plugins](https://noteable.io/chatgpt-plugin-for-notebook/). Now, you can take this even further with your own custom code.
As an example, let's give the large language models the ability to tell time.
```python
from datetime import datetime
from pytz import timezone, all_timezones, utc
from typing import Optional
from pydantic import BaseModel

def what_time(tz: Optional[str] = None):
    '''Current time, defaulting to UTC'''
    if tz is None:
        tz = utc  # no timezone given: fall back to UTC, as the docstring says
    elif tz in all_timezones:
        tz = timezone(tz)
    else:
        return 'Invalid timezone'

    return datetime.now(tz).strftime('%I:%M %p')

class WhatTime(BaseModel):
    tz: Optional[str] = None
```
Let's break this down.
`what_time` is the function we're going to provide access to. Its docstring forms the `description` for the model while the schema comes from the pydantic `BaseModel` called `WhatTime`.
```python
import chatlab
chat = chatlab.Chat()
# Register our function
chat.register(what_time, WhatTime)
```
After that, we can call `chat` with direct strings (which are turned into user messages) or with the simple message makers named `user` and `system` exported from `chatlab` (sketched after the example below).
```python
await chat("What time is it?")
```
<details style="background:#DDE6ED;color:#27374D;padding:.5rem 1rem;border-radius:5px">
<summary> 𝑓 Ran `what_time`
</summary>
<br />
Input:
```json
{}
```
Output:
```json
"11:19 AM"
```
</details>
```markdown
The current time is 11:19 AM.
```
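The same request can also be composed with the message makers described in the Messaging section below. A sketch, assuming `chat()` accepts the message dictionaries that `system` and `user` return:

```python
from chatlab import system, user

await chat(
    system("Include the timezone name in your answer"),
    user("What time is it?"),
)
```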
## Interface
The `chatlab` package exports the following:
### `Chat`
The `Chat` class is the main way to chat using OpenAI's models. It keeps a history of your chat in `Chat.messages`.
#### `Chat.submit`
`submit` is how you send all of the currently built-up messages over to OpenAI. Responses from the `assistant` are displayed as Markdown output.
```python
await chat.submit('What does a parent of three kids mean by "I have to play zone defense"?')
# Markdown response inline
chat.messages
```
```python
[{'role': 'user',
  'content': 'What does a parent of three kids mean by "I have to play zone defense"?'},
 {'role': 'assistant',
  'content': 'When a parent of three kids says "I have to play zone defense," it means that they...
```
#### `Chat.register`
You can register functions with `Chat.register` to make them available to the chat model. The function's docstring becomes the description of the function while the schema is derived from the `pydantic.BaseModel` passed in.
```python
from datetime import datetime
from typing import Optional

from pydantic import BaseModel
from pytz import timezone, all_timezones, utc

class WhatTime(BaseModel):
    tz: Optional[str] = None

def what_time(tz: Optional[str] = None):
    '''Current time, defaulting to UTC'''
    if tz is None:
        tz = utc  # no timezone given: fall back to UTC, as the docstring says
    elif tz in all_timezones:
        tz = timezone(tz)
    else:
        return 'Invalid timezone'

    return datetime.now(tz).strftime('%I:%M %p')

chat.register(what_time, WhatTime)
```
#### `Chat.messages`
The raw messages sent to and received from OpenAI. If you hit a token limit, you can remove old messages from the list to make room for more.
```python
chat.messages = chat.messages[-100:]
```
### Messaging
#### `human`/`user`
These functions create a message from the user to the chat model.
```python
from chatlab import human
human("How are you?")
```
```json
{ "role": "user", "content": "How are you?" }
```
#### `narrate`/`system`
`system` messages, also called `narrate` in `chatlab`, let you steer the model in a direction. You can use them to provide context that the user never sees. One common use is to provide initial context for the conversation.
```python
from chatlab import narrate
narrate("You are a large bird")
```
```json
{ "role": "system", "content": "You are a large bird" }
```
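For example, one way to seed a conversation with such context is to put a `narrate` message into the `Chat.messages` history documented above before submitting anything. A sketch, not necessarily the only way the library supports:

```python
import chatlab
from chatlab import narrate

chat = chatlab.Chat()
# Chat.messages is the plain message list documented above, so appending a
# narrate() message here seeds the conversation with hidden initial context.
chat.messages.append(narrate("You are a large bird"))

await chat("How is the weather up there?")
```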
## Development
This project uses Poetry for dependency management. To get started, clone the repo and run:
```bash
poetry install -E dev -E test
```
We use `ruff` and `mypy`.
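Typical invocations inside the Poetry environment look roughly like this (a sketch; the repo's exact lint and type-check configuration isn't spelled out above):

```bash
poetry run ruff check .   # lint
poetry run mypy .         # static type checking
```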
## Contributing
Pull requests are welcome. For major changes, please open an issue first to discuss what you would like to change.
Raw data

```json
{
    "_id": null,
    "home_page": "https://github.com/rgbkrk/chatlab",
    "name": "chatlab",
    "maintainer": "",
    "docs_url": null,
    "requires_python": ">=3.9,<4.0",
    "maintainer_email": "",
    "keywords": "",
    "author": "Kyle Kelley",
    "author_email": "rgbkrk@gmail.com",
    "download_url": "https://files.pythonhosted.org/packages/5d/2e/9822bfabda7724d8790a3cd921696f4aa8a8080137851a0119339c9e8d33/chatlab-1.1.1.tar.gz",
    "platform": null,
"description": "# ChatLab\n\n**Chat Experiments, Simplified**\n\n\ud83d\udcac\ud83d\udd2c\n\nChatLab is a Python package that makes it easy to experiment with OpenAI's chat models. It provides a simple interface for chatting with the models and a way to register functions that can be called from the chat model.\n\nBest yet, it's interactive in the notebook!\n\n\u26a0\ufe0f NOTE: The following docs are for `main`. Check out [chatlab.dev](https://chatlab.dev) for the docs for the most recent release. \n\n## Introduction\n\n```python\nimport chatlab\nimport random\n\ndef flip_a_coin():\n '''Returns heads or tails'''\n return random.choice(['heads', 'tails'])\n\nchat = chatlab.Chat()\nchat.register(flip_a_coin)\n\nawait chat(\"Please flip a coin for me\")\n```\n\n<details style=\"background:#DDE6ED;color:#27374D;padding:.5rem 1rem;borderRadius:5px\">\n<summary> \ud835\udc53 Ran `flip_a_coin`\n</summary>\n<br />\n\nInput:\n\n```json\n{}\n```\n\nOutput:\n\n```json\n\"tails\"\n```\n\n</details>\n\n```markdown\nIt landed on tails!\n```\n\nIn the notebook, text will stream into a Markdown output and function inputs and outputs are a nice collapsible display, like with ChatGPT Plugins.\n\nTODO: Include GIF/mp4 of this in action\n\n### Installation\n\n```bash\npip install chatlab\n```\n\n### Configuration\n\nYou'll need to set your `OPENAI_API_KEY` environment variable. You can find your API key on your [OpenAI account page](https://platform.openai.com/account/api-keys). I recommend setting it in an `.env` file when working locally.\n\nOn hosted environments like Noteable, set it in your Secrets to keep it safe from prying LLM eyes.\n\n## What can `Chat`s enable _you_ to do?\n\n\ud83d\udcac\n\nWhere `Chat`s take it next level is with _Chat Functions_. You can\n\n- declare a function\n- register the function in your `Chat`\n- watch as Chat Models call your functions!\n\nYou may recall this kind of behavior from [ChatGPT Plugins](https://noteable.io/chatgpt-plugin-for-notebook/). Now, you can take this even further with your own custom code.\n\nAs an example, let's give the large language models the ability to tell time.\n\n```python\nfrom datetime import datetime\nfrom pytz import timezone, all_timezones, utc\nfrom typing import Optional\nfrom pydantic import BaseModel\n\ndef what_time(tz: Optional[str] = None):\n '''Current time, defaulting to UTC'''\n if tz is None:\n pass\n elif tz in all_timezones:\n tz = timezone(tz)\n else:\n return 'Invalid timezone'\n\n return datetime.now(tz).strftime('%I:%M %p')\n\nclass WhatTime(BaseModel):\n tz: Optional[str] = None\n```\n\nLet's break this down.\n\n`what_time` is the function we're going to provide access to. 
Its docstring forms the `description` for the model while the schema comes from the pydantic `BaseModel` called `WhatTime`.\n\n```python\nimport chatlab\n\nchat = chatlab.Chat()\n\n# Register our function\nchat.register(what_time, WhatTime)\n```\n\nAfter that, we can call `chat` with direct strings (which are turned into user messages) or using simple message makers from `chatlab` named `user` and `system`.\n\n```python\nawait chat(\"What time is it?\")\n```\n\n<details style=\"background:#DDE6ED;color:#27374D;padding:.5rem 1rem;borderRadius:5px\">\n<summary> \ud835\udc53 Ran `what_time`\n</summary>\n<br />\n\nInput:\n\n```json\n{}\n```\n\nOutput:\n\n```json\n\"11:19 AM\"\n```\n\n</details>\n\n```markdown\nThe current time is 11:19 AM.\n```\n\n## Interface\n\nThe `chatlab` package exports\n\n### `Chat`\n\nThe `Chat` class is the main way to chat using OpenAI's models. It keeps a history of your chat in `Chat.messages`.\n\n#### `Chat.submit`\n\n`submit` is how you send all the currently built up messages over to OpenAI. Markdown output will display responses from the `assistant`.\n\n```python\nawait chat.submit('What would a parent who says \"I have to play zone defense\" mean? ')\n# Markdown response inline\nchat.messages\n```\n\n```js\n[{'role': 'user',\n 'content': 'What does a parent of three kids mean by \"I have to play zone defense\"?'},\n {'role': 'assistant',\n 'content': 'When a parent of three kids says \"I have to play zone defense,\" it means that they...\n```\n\n#### `Chat.register`\n\nYou can register functions with `Chat.register` to make them available to the chat model. The function's docstring becomes the description of the function while the schema is derived from the `pydantic.BaseModel` passed in.\n\n```python\nfrom pydantic import BaseModel\n\nclass WhatTime(BaseModel):\n tz: Optional[str] = None\n\ndef what_time(tz: Optional[str] = None):\n '''Current time, defaulting to UTC'''\n if tz is None:\n pass\n elif tz in all_timezones:\n tz = timezone(tz)\n else:\n return 'Invalid timezone'\n\n return datetime.now(tz).strftime('%I:%M %p')\n\nchat.register(what_time, WhatTime)\n```\n\n#### `Chat.messages`\n\nThe raw messages sent and received to OpenAI. If you hit a token limit, you can remove old messages from the list to make room for more.\n\n```python\nchat.messages = chat.messages[-100:]\n```\n\n### Messaging\n\n#### `human`/`user`\n\nThese functions create a message from the user to the chat model.\n\n```python\nfrom chatlab import human\n\nhuman(\"How are you?\")\n```\n\n```json\n{ \"role\": \"user\", \"content\": \"How are you?\" }\n```\n\n#### `narrate`/`system`\n\n`system` messages, also called `narrate` in `chatlab`, allow you to steer the model in a direction. You can use these to provide context without being seen by the user. One common use is to include it as initial context for the conversation.\n\n```python\nfrom chatlab import narrate\n\nnarrate(\"You are a large bird\")\n```\n\n```json\n{ \"role\": \"system\", \"content\": \"You are a large bird\" }\n```\n\n## Development\n\nThis project uses poetry for dependency management. To get started, clone the repo and run\n\n```bash\npoetry install -E dev -E test\n```\n\nWe use `ruff` and `mypy`.\n\n## Contributing\n\nPull requests are welcome. For major changes, please open an issue first to discuss what you would like to change.\n\n",
"bugtrack_url": null,
"license": "BSD-3-Clause",
"summary": "Chat Plugin Experiments, Simplified. Create agents and give them superpowers in your notebooks.",
"version": "1.1.1",
"project_urls": {
"Homepage": "https://github.com/rgbkrk/chatlab"
},
"split_keywords": [],
"urls": [
{
"comment_text": "",
"digests": {
"blake2b_256": "33f28787949e324c773bc1e9b1d513ad34a344b4c426c2d34ac4bc04457ed787",
"md5": "ea7e22ed4c7b4684d8e29e1093e624e6",
"sha256": "99a5930819309e496091d353b2dead9331724f8cabee928c780734dd2e00b2d9"
},
"downloads": -1,
"filename": "chatlab-1.1.1-py3-none-any.whl",
"has_sig": false,
"md5_digest": "ea7e22ed4c7b4684d8e29e1093e624e6",
"packagetype": "bdist_wheel",
"python_version": "py3",
"requires_python": ">=3.9,<4.0",
"size": 35998,
"upload_time": "2023-11-21T02:32:19",
"upload_time_iso_8601": "2023-11-21T02:32:19.347900Z",
"url": "https://files.pythonhosted.org/packages/33/f2/8787949e324c773bc1e9b1d513ad34a344b4c426c2d34ac4bc04457ed787/chatlab-1.1.1-py3-none-any.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": "",
"digests": {
"blake2b_256": "5d2e9822bfabda7724d8790a3cd921696f4aa8a8080137851a0119339c9e8d33",
"md5": "32d3b424bb5a0b6191a208ce1a71a2fa",
"sha256": "8e96c90651c0fd1c97609c81b4c7337981d63af274c4bf5e0cb77661ef358d97"
},
"downloads": -1,
"filename": "chatlab-1.1.1.tar.gz",
"has_sig": false,
"md5_digest": "32d3b424bb5a0b6191a208ce1a71a2fa",
"packagetype": "sdist",
"python_version": "source",
"requires_python": ">=3.9,<4.0",
"size": 36968,
"upload_time": "2023-11-21T02:32:21",
"upload_time_iso_8601": "2023-11-21T02:32:21.545172Z",
"url": "https://files.pythonhosted.org/packages/5d/2e/9822bfabda7724d8790a3cd921696f4aa8a8080137851a0119339c9e8d33/chatlab-1.1.1.tar.gz",
"yanked": false,
"yanked_reason": null
}
],
"upload_time": "2023-11-21 02:32:21",
"github": true,
"gitlab": false,
"bitbucket": false,
"codeberg": false,
"github_user": "rgbkrk",
"github_project": "chatlab",
"travis_ci": false,
"coveralls": false,
"github_actions": true,
"lcname": "chatlab"
}