<img src="img/bashimu.jpeg" alt="drawing" width="200"/>
# Bashimu
A command-line LLM chat tool that supports multiple AI providers with customizable personas.
## Features
- **Multiple AI Providers**: OpenAI, Anthropic Claude, Google Gemini, and Ollama
- **Customizable Personas**: Define AI behavior and conversation styles
- **Interactive REPL**: Rich terminal interface with markdown rendering
- **Non-Interactive Mode**: Send queries directly from the command line
- **Conversation Management**: Save, edit, and review chat history
- **Provider & Model Switching**: Change providers and models on the fly
## Installation
```bash
pip install bashimu
```
## Quick Start
### Interactive Mode
Launch the interactive chat:
```bash
bashimu-cli
```
### Non-Interactive Mode
Send a single query and get the response:
```bash
bashimu-cli "What is the capital of France?"
```
Use with specific provider or persona:
```bash
bashimu-cli --provider anthropic "Explain Python decorators"
bashimu-cli --persona coding_mentor "Review this code: def foo(): pass"
```
Perfect for scripting and piping:
```bash
bashimu-cli "Generate a JSON list of 5 colors" | jq '.'
```
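In a longer script, the exit status of `bashimu-cli` can gate the rest of the pipeline. A minimal sketch (file names here are illustrative, and this assumes `bashimu-cli` is installed and configured):

```shell
#!/usr/bin/env bash
# Capture the response; skip the summary step if the call fails
# (missing API key, network error, etc.).
if response=$(bashimu-cli "Summarize this log: $(tail -n 50 app.log)"); then
  printf '%s\n' "$response" > summary.txt
else
  echo "bashimu-cli failed; skipping summary" >&2
fi
```

Checking the exit status this way keeps a partial or empty response from silently overwriting `summary.txt`.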
## Configuration
On first run, bashimu creates configuration files at:
- Config: `~/.config/llm-chat/config.json`
- Personas: `~/.config/llm-chat/personas/`
Edit the config file to add your API keys:
```json
{
  "default_provider": "openai",
  "default_persona": "default",
  "providers": {
    "openai": {
      "name": "OpenAI",
      "api_key": "YOUR_OPENAI_API_KEY",
      "default_model": "gpt-4o"
    },
    "anthropic": {
      "name": "Anthropic",
      "api_key": "YOUR_ANTHROPIC_API_KEY",
      "default_model": "claude-3-haiku-20240307"
    }
  }
}
```
## Interactive Commands
When in interactive mode, use these commands:
- `/help` - Show help message
- `/clear` - Clear conversation history
- `/history` - Show conversation history
- `/save` - Save conversation to JSON file
- `/edit` - Edit and resend your last message
- `/provider [name]` - Switch AI provider
- `/models` - List available models
- `/model [name]` - Switch model
- `/personas` - List available personas
- `/persona [name]` - Switch persona (clears chat)
- `/quit`, `/q`, `/exit` - Exit the chat
## Creating Custom Personas
Create JSON files in `~/.config/llm-chat/personas/`:
```json
{
  "name": "Code Reviewer",
  "provider": "anthropic",
  "model": "claude-3-opus-20240229",
  "user_identity": "A developer seeking code review",
  "ai_identity": "An expert code reviewer",
  "conversation_goal": "To review code for bugs, style, and best practices",
  "response_style": "Technical, thorough, and constructive"
}
```
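Persona files can also be created from the shell. The sketch below writes a hypothetical `doc_writer` persona using the same fields as the example above (the name and field values are illustrative):

```shell
# Create the personas directory if it doesn't exist yet
mkdir -p ~/.config/llm-chat/personas

# Write the persona file with a quoted heredoc (no variable expansion)
cat > ~/.config/llm-chat/personas/doc_writer.json <<'EOF'
{
  "name": "Doc Writer",
  "provider": "openai",
  "model": "gpt-4o",
  "user_identity": "A maintainer writing project documentation",
  "ai_identity": "A technical writer",
  "conversation_goal": "To draft and polish documentation",
  "response_style": "Concise and well-structured"
}
EOF

# Sanity-check that the file is valid JSON before using it
python3 -m json.tool ~/.config/llm-chat/personas/doc_writer.json > /dev/null
```

The persona should then be selectable with `/persona doc_writer` in interactive mode, or `--persona doc_writer` on the command line.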
## Command-Line Options
```
bashimu-cli [query] [options]

Positional Arguments:
  query                Optional query to send (non-interactive mode)

Options:
  --provider PROVIDER  Override the default provider
  --persona PERSONA    Override the default persona
  --config CONFIG      Path to a custom config file
  -h, --help           Show help message
```
## License
MIT License; see the LICENSE file for details.
## Links
- Homepage: https://github.com/wiktorjl/bashimu
- Issues: https://github.com/wiktorjl/bashimu/issues