| Field | Value |
|-------|-------|
| Name | bespoken |
| Version | 0.2.2 |
| home_page | None |
| Summary | A toolbox to build your own assistant for the terminal. |
| upload_time | 2025-07-11 21:37:23 |
| maintainer | None |
| docs_url | None |
| author | None |
| requires_python | >=3.8 |
| license | MIT |
| keywords | ai, assistant, llm, code-editor, marimo |
| VCS | |
| bugtrack_url | |
| requirements | No requirements were recorded. |
| Travis-CI | No Travis. |
| coveralls test coverage | No coveralls. |
# bespoken
```
██████╗ ███████╗███████╗██████╗  ██████╗ ██╗  ██╗███████╗███╗   ██╗
██╔══██╗██╔════╝██╔════╝██╔══██╗██╔═══██╗██║ ██╔╝██╔════╝████╗  ██║
██████╔╝█████╗  ███████╗██████╔╝██║   ██║█████╔╝ █████╗  ██╔██╗ ██║
██╔══██╗██╔══╝  ╚════██║██╔═══╝ ██║   ██║██╔═██╗ ██╔══╝  ██║╚██╗██║
██████╔╝███████╗███████║██║     ╚██████╔╝██║  ██╗███████╗██║ ╚████║
╚═════╝ ╚══════╝╚══════╝╚═╝      ╚═════╝ ╚═╝  ╚═╝╚══════╝╚═╝  ╚═══╝
A terminal chat experience that you can configure yourself.
```
## Installation
Basic installation:
```bash
uv pip install bespoken
```
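
If you are not using `uv`, installing from PyPI with plain pip should work as well; this is a hedged alternative rather than an officially documented path:

```bash
pip install bespoken
```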
## Usage
This library uses [llm](https://llm.datasette.io/en/stable/) under the hood and gives you building blocks for making LLM chat interfaces from the command line. Here's an example:

The interface above was defined with the code below:

```python
from bespoken import chat
from bespoken.tools import FileTool, TodoTools, PlaywrightTool

chat(
    model_name="anthropic/claude-3-5-sonnet-20240620",
    tools=[FileTool("edit.py")],
    system_prompt="You are a coding assistant that can make edits to a single file.",
    debug=True,
)
```
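
The model name above points at an Anthropic model. Since `llm` powers the backend, you will typically also need a plugin that exposes Anthropic models plus an API key. The setup below is a hedged sketch; the plugin choice and the `assistant.py` filename are assumptions, not something bespoken prescribes:

```bash
# Assumptions: the snippet above is saved as assistant.py, and Anthropic models
# are made available to the llm library through the llm-anthropic plugin.
uv pip install llm-anthropic   # plugin that registers Anthropic models with llm
llm keys set anthropic         # store the API key that llm should read
python assistant.py            # start the terminal chat defined above
```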
## Features
### Autocomplete
Tab completion for commands and file paths. Use `@file.py` to get file path suggestions, type `/` followed by <kbd>TAB</kbd> to autocomplete commands, or use the arrow keys for command history.

### Custom slash commands
Define your own `/commands` that either send text to the LLM or trigger interactive functions:
```python
from bespoken import chat
from bespoken import ui  # assumption: the interactive ui helper used below lives here

def save_conversation():
    """Save conversation to file"""
    filename = ui.input("Filename: ")
    return f"Saved to {filename}"

chat(
    ...,
    slash_commands={
        "/save": save_conversation,
        "/formal": "Please respond in a formal manner.",
    },
)
```
## Why?
The goal is to host a collection of tools that you can pass to the LLM, but the main idea is that it should also be easy to constrain the chat. The `FileTool`, for example, only allows the LLM to edit a single file declared upfront. This significantly reduces prompt-injection risk while still covering a lot of use cases. It is also a nice exercise that makes tools like Claude Code feel less magical, and you can swap in any other LLM as you see fit.
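
To make that constraint concrete, a single-file tool can simply refuse any path other than the one it was constructed with. The sketch below is illustrative only and is not bespoken's actual `FileTool` implementation; the class and method names are hypothetical:

```python
from pathlib import Path


class SingleFileEditor:
    """Hypothetical sketch of a tool constrained to one pre-declared file."""

    def __init__(self, allowed_path: str):
        self.allowed = Path(allowed_path).resolve()

    def write(self, path: str, content: str) -> str:
        # Refuse edits to anything except the file declared upfront.
        if Path(path).resolve() != self.allowed:
            return f"Refused: this tool may only edit {self.allowed}"
        self.allowed.write_text(content)
        return f"Updated {self.allowed}"
```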
This project is in its early days, but it feels exciting to work on!
## Raw data
{
"_id": null,
"home_page": null,
"name": "bespoken",
"maintainer": null,
"docs_url": null,
"requires_python": ">=3.8",
"maintainer_email": null,
"keywords": "ai, assistant, llm, code-editor, marimo",
"author": null,
"author_email": "Your Name <your.email@example.com>",
"download_url": "https://files.pythonhosted.org/packages/a3/81/6e9b6238104e8e5b8ed8969736ea415c8efa6d7c4e95610bb392f0d057d3/bespoken-0.2.2.tar.gz",
"platform": null,
"description": "# bespoken\n\n```\n\n\u2588\u2588\u2588\u2588\u2588\u2588\u2557 \u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2557\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2557\u2588\u2588\u2588\u2588\u2588\u2588\u2557 \u2588\u2588\u2588\u2588\u2588\u2588\u2557 \u2588\u2588\u2557 \u2588\u2588\u2557\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2557\u2588\u2588\u2588\u2557 \u2588\u2588\u2557\n\u2588\u2588\u2554\u2550\u2550\u2588\u2588\u2557\u2588\u2588\u2554\u2550\u2550\u2550\u2550\u255d\u2588\u2588\u2554\u2550\u2550\u2550\u2550\u255d\u2588\u2588\u2554\u2550\u2550\u2588\u2588\u2557\u2588\u2588\u2554\u2550\u2550\u2550\u2588\u2588\u2557\u2588\u2588\u2551 \u2588\u2588\u2554\u255d\u2588\u2588\u2554\u2550\u2550\u2550\u2550\u255d\u2588\u2588\u2588\u2588\u2557 \u2588\u2588\u2551\n\u2588\u2588\u2588\u2588\u2588\u2588\u2554\u255d\u2588\u2588\u2588\u2588\u2588\u2557 \u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2557\u2588\u2588\u2588\u2588\u2588\u2588\u2554\u255d\u2588\u2588\u2551 \u2588\u2588\u2551\u2588\u2588\u2588\u2588\u2588\u2554\u255d \u2588\u2588\u2588\u2588\u2588\u2557 \u2588\u2588\u2554\u2588\u2588\u2557 \u2588\u2588\u2551\n\u2588\u2588\u2554\u2550\u2550\u2588\u2588\u2557\u2588\u2588\u2554\u2550\u2550\u255d \u255a\u2550\u2550\u2550\u2550\u2588\u2588\u2551\u2588\u2588\u2554\u2550\u2550\u2550\u255d \u2588\u2588\u2551 \u2588\u2588\u2551\u2588\u2588\u2554\u2550\u2588\u2588\u2557 \u2588\u2588\u2554\u2550\u2550\u255d \u2588\u2588\u2551\u255a\u2588\u2588\u2557\u2588\u2588\u2551\n\u2588\u2588\u2588\u2588\u2588\u2588\u2554\u255d\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2557\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2551\u2588\u2588\u2551 \u255a\u2588\u2588\u2588\u2588\u2588\u2588\u2554\u255d\u2588\u2588\u2551 \u2588\u2588\u2557\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2557\u2588\u2588\u2551 \u255a\u2588\u2588\u2588\u2588\u2551\n\u255a\u2550\u2550\u2550\u2550\u2550\u255d \u255a\u2550\u2550\u2550\u2550\u2550\u2550\u255d\u255a\u2550\u2550\u2550\u2550\u2550\u2550\u255d\u255a\u2550\u255d \u255a\u2550\u2550\u2550\u2550\u2550\u255d \u255a\u2550\u255d \u255a\u2550\u255d\u255a\u2550\u2550\u2550\u2550\u2550\u2550\u255d\u255a\u2550\u255d \u255a\u2550\u2550\u2550\u255d\n\n\nA terminal chat experience that you can configure yourself.\n```\n\n## Installation\n\nBasic installation:\n\n```bash\nuv pip install bespoken\n```\n\n## Usage\n\nThis library uses [llm](https://llm.datasette.io/en/stable/) under the hood to provide you with building blocks to make LLM chat interfaces from the commandline. Here's an example:\n\n\n\nThis interface was defined via below:\n\n```python\nfrom bespoken import chat\nfrom bespoken.tools import FileTool, TodoTools, PlaywrightTool\n\nchat(\n model_name=\"anthropic/claude-3-5-sonnet-20240620\",\n tools=[FileTool(\"edit.py\")],\n system_prompt=\"You are a coding assistant that can make edits to a single file.\",\n debug=True,\n)\n```\n\n## Features \n\n### Autocomplete \n\nTab completion for commands and file paths. Use `@file.py` to get file path suggestions, \"/\" + <kbd>TAB></kbd> to autocomplete commands or use arrow keys for command history.\n\n\n\n### Custom slash commands\n\nDefine your own `/commands` that either send text to the LLM or trigger interactive functions:\n\n```python\ndef save_conversation():\n \"\"\"Save conversation to file\"\"\"\n filename = ui.input(\"Filename: \")\n return f\"Saved to {filename}\"\n\nchat(\n ...,\n slash_commands={\n \"/save\": save_conversation,\n \"/formal\": \"Please respond in a formal manner.\",\n }\n)\n```\n\n## Why? 
\n\nThe goal is to host a bunch of tools that you can pass to the LLM, but the main idea here is that you can also make it easy to constrain the chat. The `FileTool`, for example, only allows the LLM to make edits to a single file declared upfront. This significantly reduces any injection risks and still covers a lot of use-cases. It is also a nice exercise to make tools like claude code feel less magical, and you can also swap out the LLM with any other one as you see fit. \n\nThis project is in early days at the moment, but it feels exciting to work on!\n",
"bugtrack_url": null,
"license": "MIT",
"summary": "A toolbox to build your own assistant for in the terminal.",
"version": "0.2.2",
"project_urls": {
"Documentation": "https://github.com/yourusername/bespoken#readme",
"Homepage": "https://github.com/yourusername/bespoken",
"Issues": "https://github.com/yourusername/bespoken/issues",
"Repository": "https://github.com/yourusername/bespoken.git"
},
"split_keywords": [
"ai",
" assistant",
" llm",
" code-editor",
" marimo"
],
"urls": [
{
"comment_text": null,
"digests": {
"blake2b_256": "2e12b3124c82d42d7afa04f15ed08b3e691ac6a0611f1c9d69220ecb280d1117",
"md5": "358d5f79a8a3a11cc71f8901dee549d3",
"sha256": "21340c09ffc2031c29335a4ec3870b54158867c26095934ee6a2576e1d0c9981"
},
"downloads": -1,
"filename": "bespoken-0.2.2-py3-none-any.whl",
"has_sig": false,
"md5_digest": "358d5f79a8a3a11cc71f8901dee549d3",
"packagetype": "bdist_wheel",
"python_version": "py3",
"requires_python": ">=3.8",
"size": 24048,
"upload_time": "2025-07-11T21:37:21",
"upload_time_iso_8601": "2025-07-11T21:37:21.929651Z",
"url": "https://files.pythonhosted.org/packages/2e/12/b3124c82d42d7afa04f15ed08b3e691ac6a0611f1c9d69220ecb280d1117/bespoken-0.2.2-py3-none-any.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": null,
"digests": {
"blake2b_256": "a3816e9b6238104e8e5b8ed8969736ea415c8efa6d7c4e95610bb392f0d057d3",
"md5": "e6578e8e082824c3f61ed420746fcbc7",
"sha256": "c8533f91aeeaa6638a5762a74a46f8b5df82e0df476746fe88cce1247f17e14e"
},
"downloads": -1,
"filename": "bespoken-0.2.2.tar.gz",
"has_sig": false,
"md5_digest": "e6578e8e082824c3f61ed420746fcbc7",
"packagetype": "sdist",
"python_version": "source",
"requires_python": ">=3.8",
"size": 26795,
"upload_time": "2025-07-11T21:37:23",
"upload_time_iso_8601": "2025-07-11T21:37:23.259854Z",
"url": "https://files.pythonhosted.org/packages/a3/81/6e9b6238104e8e5b8ed8969736ea415c8efa6d7c4e95610bb392f0d057d3/bespoken-0.2.2.tar.gz",
"yanked": false,
"yanked_reason": null
}
],
"upload_time": "2025-07-11 21:37:23",
"github": true,
"gitlab": false,
"bitbucket": false,
"codeberg": false,
"github_user": "yourusername",
"github_project": "bespoken#readme",
"github_not_found": true,
"lcname": "bespoken"
}