| Field | Value |
| --- | --- |
| Name | tunacode-cli |
| Version | 0.0.66 |
| Summary | Your agentic CLI developer. |
| upload_time | 2025-08-14 19:04:57 |
| home_page | None |
| maintainer | None |
| docs_url | None |
| author | None |
| requires_python | <3.14,>=3.10 |
| license | MIT |
| keywords | agent, automation, cli, development |
| VCS | |
| bugtrack_url | |
| requirements | No requirements were recorded. |
| Travis-CI | No Travis. |
| coveralls test coverage | No coveralls. |
# TunaCode CLI
<div align="center">
[PyPI version](https://badge.fury.io/py/tunacode-cli)
[Downloads](https://pepy.tech/project/tunacode-cli)
[Python versions](https://www.python.org/downloads/)
[License: MIT](https://opensource.org/licenses/MIT)
**AI-powered CLI coding assistant**

</div>
---
## Quick Install

**Prerequisites:** Python 3.10 or higher and Git.
```bash
# Option 1: One-line install (Linux/macOS)
wget -qO- https://raw.githubusercontent.com/alchemiststudiosDOTai/tunacode/master/scripts/install_linux.sh | bash
# Option 2: pip install
pip install tunacode-cli
```
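Either method installs the `tunacode` command. As a quick sanity check (assuming the entry point landed on your `PATH`):

```bash
# Confirm the CLI is installed and report its version
tunacode --version
```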
## Development Installation
For contributors and developers who want to work on TunaCode:
```bash
# Clone the repository
git clone https://github.com/alchemiststudiosDOTai/tunacode.git
cd tunacode
# Quick setup (recommended)
./scripts/setup_dev_env.sh
# Or manual setup
python3 -m venv venv
source venv/bin/activate # On Windows: venv\Scripts\activate
pip install -e ".[dev]"
# Verify installation
python -m tunacode --version
```
See [Development Guide](docs/DEVELOPMENT.md) for detailed instructions.
## Configuration
Choose your AI provider and set your API key:
```bash
# OpenAI
tunacode --model "openai:gpt-4o" --key "sk-your-openai-key"
# Anthropic Claude
tunacode --model "anthropic:claude-3.5-sonnet" --key "sk-ant-your-anthropic-key"
# OpenRouter (100+ models)
tunacode --model "openrouter:openai/gpt-4o" --key "sk-or-your-openrouter-key"
```
Your config is saved to `~/.config/tunacode.json`. This file stores your API keys, model preferences, and runtime settings like `max_iterations` (default: 40) and `context_window_size`. You can edit it directly with `nvim ~/.config/tunacode.json` or see [the complete configuration example](documentation/configuration/config-file-example.md) for all available options.
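As a rough sketch only (every key below except `max_iterations` and `context_window_size` is an assumption; the linked example documents the real schema), the file could be created like this:

```bash
# Hypothetical sketch of ~/.config/tunacode.json -- field names other than
# max_iterations and context_window_size are illustrative guesses, and this
# overwrites any existing config, so prefer editing the real file directly.
cat > ~/.config/tunacode.json <<'EOF'
{
  "default_model": "openrouter:openai/gpt-4o",
  "env": { "OPENROUTER_API_KEY": "sk-or-your-openrouter-key" },
  "settings": {
    "max_iterations": 40,
    "context_window_size": 200000
  }
}
EOF
```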
### Recommended Models
Based on extensive testing, these models provide the best performance:
- `google/gemini-2.5-pro` - Excellent for complex reasoning
- `openai/gpt-4.1` - Strong general-purpose model
- `deepseek/deepseek-r1-0528` - Great for code generation
- `openai/gpt-4.1-mini` - Fast and cost-effective
- `anthropic/claude-4-sonnet-20250522` - Superior context handling
_Note: Formal evaluations coming soon. Any model can work, but these have shown the best results in practice._
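For example, to start a session with one of these models through OpenRouter, using the same `provider:model` format as above (exact model IDs may differ on your provider):

```bash
# Launch with a recommended model via OpenRouter
tunacode --model "openrouter:google/gemini-2.5-pro" --key "sk-or-your-openrouter-key"
```

You can also switch models mid-session with the `/model` command listed under Basic Commands.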
## Start Coding
```bash
tunacode
```
## Basic Commands
| Command | Description |
| ------------------------ | ---------------------- |
| `/help` | Show all commands |
| `/model <provider:name>` | Switch model |
| `/clear` | Clear message history |
| `/compact` | Summarize conversation |
| `/branch <name>` | Create Git branch |
| `/yolo` | Skip confirmations |
| `!<command>` | Run shell command |
| `exit` | Exit TunaCode |
## Performance
TunaCode leverages parallel execution for read-only operations, achieving **3x faster** file operations.

Multiple file reads, directory listings, and searches execute concurrently using async I/O, making code exploration significantly faster.
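As a loose shell analogy (not TunaCode's actual implementation, and the paths are illustrative): independent read-only commands can be launched together instead of one after another.

```bash
# Sequential: each read-only step waits for the previous one to finish
grep -rn "TODO" src/; ls -R src/; wc -l src/*.py

# Concurrent: start them all, then wait for the whole batch
grep -rn "TODO" src/ & ls -R src/ & wc -l src/*.py & wait
```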
## Features in Development
- **Streaming UI**: Currently working on implementing streaming responses for better user experience
- **Bug Fixes**: Actively addressing issues - please report any bugs you encounter!
_Note: While the tool is fully functional, we're focusing on stability and core features before optimizing for speed._
## Safety First
⚠️ **Important**: TunaCode can modify your codebase. Always:
- Use Git branches before making changes
- Review file modifications before confirming
- Keep backups of important work
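A minimal pre-flight routine along these lines (plain Git, nothing TunaCode-specific; the branch name is just an example):

```bash
# Work on a throwaway branch so changes are easy to review or discard
git checkout -b tunacode/experiment
tunacode

# Afterwards, review exactly what the session changed
git diff main...tunacode/experiment
```

You can also create the branch from inside a session with `/branch <name>`.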
## Documentation
- [**Features**](docs/FEATURES.md) - All features, tools, and commands
- [**Advanced Configuration**](docs/ADVANCED-CONFIG.md) - Provider setup, MCP, customization
- [**Architecture**](docs/ARCHITECTURE.md) - Source code organization and design
- [**Development**](docs/DEVELOPMENT.md) - Contributing and development setup
- [**Troubleshooting**](docs/TROUBLESHOOTING.md) - Common issues and solutions
## Links
- [PyPI Package](https://pypi.org/project/tunacode-cli/)
- [GitHub Repository](https://github.com/alchemiststudiosDOTai/tunacode)
- [Report Issues](https://github.com/alchemiststudiosDOTai/tunacode/issues)
---
MIT License - see [LICENSE](LICENSE) file
## Raw data
{
"_id": null,
"home_page": null,
"name": "tunacode-cli",
"maintainer": null,
"docs_url": null,
"requires_python": "<3.14,>=3.10",
"maintainer_email": null,
"keywords": "agent, automation, cli, development",
"author": null,
"author_email": "larock22 <noreply@github.com>",
"download_url": "https://files.pythonhosted.org/packages/2a/d2/37d68bf1a26d40cd38deba3c36ad68a6489f9e007d405f966ff5d8810b82/tunacode_cli-0.0.66.tar.gz",
"platform": null,
"description": "# TunaCode CLI\n\n<div align=\"center\">\n\n[](https://badge.fury.io/py/tunacode-cli)\n[](https://pepy.tech/project/tunacode-cli)\n[](https://www.python.org/downloads/)\n[](https://opensource.org/licenses/MIT)\n\n**AI-powered CLI coding assistant**\n\n\n\n</div>\n\n---\n\n## Quick Install\n\n```bash\n# Option 1: One-line install (Linux/macOS)\nwget -qO- https://raw.githubusercontent.com/alchemiststudiosDOTai/tunacode/master/scripts/install_linux.sh | bash\n\n# Option 2: pip install\npip install tunacode-cli\n```\n\n## Development Installation\n\nFor contributors and developers who want to work on TunaCode:\n\n```bash\n# Clone the repository\ngit clone https://github.com/alchemiststudiosDOTai/tunacode.git\ncd tunacode\n\n# Quick setup (recommended)\n./scripts/setup_dev_env.sh\n\n# Or manual setup\npython3 -m venv venv\nsource venv/bin/activate # On Windows: venv\\Scripts\\activate\npip install -e \".[dev]\"\n\n# Verify installation\npython -m tunacode --version\n```\n\nSee [Development Guide](docs/DEVELOPMENT.md) for detailed instructions.\n\n## Configuration\n\nChoose your AI provider and set your API key:\n\n```bash\n# OpenAI\ntunacode --model \"openai:gpt-4o\" --key \"sk-your-openai-key\"\n\n# Anthropic Claude\ntunacode --model \"anthropic:claude-3.5-sonnet\" --key \"sk-ant-your-anthropic-key\"\n\n# OpenRouter (100+ models)\ntunacode --model \"openrouter:openai/gpt-4o\" --key \"sk-or-your-openrouter-key\"\n```\n\nYour config is saved to `~/.config/tunacode.json`. This file stores your API keys, model preferences, and runtime settings like `max_iterations` (default: 40) and `context_window_size`. You can edit it directly with `nvim ~/.config/tunacode.json` or see [the complete configuration example](documentation/configuration/config-file-example.md) for all available options.\n\n### Recommended Models\n\nBased on extensive testing, these models provide the best performance:\n\n- `google/gemini-2.5-pro` - Excellent for complex reasoning\n- `openai/gpt-4.1` - Strong general-purpose model\n- `deepseek/deepseek-r1-0528` - Great for code generation\n- `openai/gpt-4.1-mini` - Fast and cost-effective\n- `anthropic/claude-4-sonnet-20250522` - Superior context handling\n\n_Note: Formal evaluations coming soon. Any model can work, but these have shown the best results in practice._\n\n## Start Coding\n\n```bash\ntunacode\n```\n\n## Basic Commands\n\n| Command | Description |\n| ------------------------ | ---------------------- |\n| `/help` | Show all commands |\n| `/model <provider:name>` | Switch model |\n| `/clear` | Clear message history |\n| `/compact` | Summarize conversation |\n| `/branch <name>` | Create Git branch |\n| `/yolo` | Skip confirmations |\n| `!<command>` | Run shell command |\n| `exit` | Exit TunaCode |\n\n## Performance\n\nTunaCode leverages parallel execution for read-only operations, achieving **3x faster** file operations:\n\n\n\nMultiple file reads, directory listings, and searches execute concurrently using async I/O, making code exploration significantly faster.\n\n## Features in Development\n\n- **Streaming UI**: Currently working on implementing streaming responses for better user experience\n- **Bug Fixes**: Actively addressing issues - please report any bugs you encounter!\n\n_Note: While the tool is fully functional, we're focusing on stability and core features before optimizing for speed._\n\n## Safety First\n\n\u26a0\ufe0f **Important**: TunaCode can modify your codebase. 
Always:\n\n- Use Git branches before making changes\n- Review file modifications before confirming\n- Keep backups of important work\n\n## Documentation\n\n- [**Features**](docs/FEATURES.md) - All features, tools, and commands\n- [**Advanced Configuration**](docs/ADVANCED-CONFIG.md) - Provider setup, MCP, customization\n- [**Architecture**](docs/ARCHITECTURE.md) - Source code organization and design\n- [**Development**](docs/DEVELOPMENT.md) - Contributing and development setup\n- [**Troubleshooting**](docs/TROUBLESHOOTING.md) - Common issues and solutions\n\n## Links\n\n- [PyPI Package](https://pypi.org/project/tunacode-cli/)\n- [GitHub Repository](https://github.com/alchemiststudiosDOTai/tunacode)\n- [Report Issues](https://github.com/alchemiststudiosDOTai/tunacode/issues)\n\n---\n\nMIT License - see [LICENSE](LICENSE) file\n\nhello from tuna world\n\nhello world\n\n## Getting Started\n\n### Prerequisites\n\nBefore you begin, ensure you have the following installed:\n\n- Python 3.10 or higher\n- Git\n\n### Installation\n\nTo install TunaCode, you can use one of the following methods:\n\n```bash\n# Option 1: One-line install (Linux/macOS)\nwget -qO- https://raw.githubusercontent.com/alchemiststudiosDOTai/tunacode/master/scripts/install_linux.sh | bash\n\n# Option 2: pip install\npip install tunacode-cli\n```\n\n### Development Installation\n\nFor developers who want to contribute to TunaCode:\n\n```bash\n# Clone the repository\ngit clone https://github.com/alchemiststudiosDOTai/tunacode.git\ncd tunacode\n\n# Quick setup (recommended)\n./scripts/setup_dev_env.sh\n\n# Or manual setup\npython3 -m venv venv\nsource venv/bin/activate # On Windows: venv\\Scripts\\activate\npip install -e \".[dev]\"\n\n# Verify installation\npython -m tunacode --version\n```\n\nSee [Development Guide](docs/DEVELOPMENT.md) for detailed instructions.\n\n### Configuration\n\nChoose your AI provider and set your API key:\n\n```bash\n# OpenAI\ntunacode --model \"openai:gpt-4o\" --key \"sk-your-openai-key\"\n\n# Anthropic Claude\ntunacode --model \"anthropic:claude-3.5-sonnet\" --key \"sk-ant-your-anthropic-key\"\n\n# OpenRouter (100+ models)\ntunacode --model \"openrouter:openai/gpt-4o\" --key \"sk-or-your-openrouter-key\"\n```\n\nYour config is saved to `~/.config/tunacode.json`. This file stores your API keys, model preferences, and runtime settings like `max_iterations` (default: 40) and `context_window_size`. You can edit it directly with `nvim ~/.config/tunacode.json` or see [the complete configuration example](documentation/configuration/config-file-example.md) for all available options.\n\n### Recommended Models\n\nBased on extensive testing, these models provide the best performance:\n\n- `google/gemini-2.5-pro` - Excellent for complex reasoning\n- `openai/gpt-4.1` - Strong general-purpose model\n- `deepseek/deepseek-r1-0528` - Great for code generation\n- `openai/gpt-4.1-mini` - Fast and cost-effective\n- `anthropic/claude-4-sonnet-20250522` - Superior context handling\n\n_Note: Formal evaluations coming soon. 
Any model can work, but these have shown the best results in practice._\n\n## Usage\n\n### Starting TunaCode\n\n```bash\ntunacode\n```\n\n### Basic Commands\n\n| Command | Description |\n| ------------------------ | ---------------------- |\n| `/help` | Show all commands |\n| `/model <provider:name>` | Switch model |\n| `/clear` | Clear message history |\n| `/compact` | Summarize conversation |\n| `/branch <name>` | Create Git branch |\n| `/yolo` | Skip confirmations |\n| `!<command>` | Run shell command |\n| `exit` | Exit TunaCode |\n\n## Performance\n\nTunaCode leverages parallel execution for read-only operations, achieving **3x faster** file operations:\n\n\n\nMultiple file reads, directory listings, and searches execute concurrently using async I/O, making code exploration significantly faster.\n\n## Features in Development\n\n- **Streaming UI**: Currently working on implementing streaming responses for better user experience\n- **Bug Fixes**: Actively addressing issues - please report any bugs you encounter!\n\n_Note: While the tool is fully functional, we're focusing on stability and core features before optimizing for speed._\n\n## Safety First\n\n\u26a0\ufe0f **Important**: TunaCode can modify your codebase. Always:\n\n- Use Git branches before making changes\n- Review file modifications before confirming\n- Keep backups of important work\n\n## Documentation\n\n- [**Features**](docs/FEATURES.md) - All features, tools, and commands\n- [**Advanced Configuration**](docs/ADVANCED-CONFIG.md) - Provider setup, MCP, customization\n- [**Architecture**](docs/ARCHITECTURE.md) - Source code organization and design\n- [**Development**](docs/DEVELOPMENT.md) - Contributing and development setup\n- [**Troubleshooting**](docs/TROUBLESHOOTING.md) - Common issues and solutions\n\n## Links\n\n- [PyPI Package](https://pypi.org/project/tunacode-cli/)\n- [GitHub Repository](https://github.com/alchemiststudiosDOTai/tunacode)\n- [Report Issues](https://github.com/alchemiststudiosDOTai/tunacode/issues)\n\n---\n\nMIT License - see [LICENSE](LICENSE) file\n# Test\n# Test\n",
"bugtrack_url": null,
"license": "MIT",
"summary": "Your agentic CLI developer.",
"version": "0.0.66",
"project_urls": {
"Documentation": "https://github.com/alchemiststudiosDOTai/tunacode#readme",
"Homepage": "https://tunacode.xyz/",
"Issues": "https://github.com/alchemiststudiosDOTai/tunacode/issues",
"Repository": "https://github.com/alchemiststudiosDOTai/tunacode"
},
"split_keywords": [
"agent",
" automation",
" cli",
" development"
],
"urls": [
{
"comment_text": null,
"digests": {
"blake2b_256": "68f1a1b8693fcd4e6ea019e59467087a11f22378db4f7b1225e68034a14df665",
"md5": "494c22ed02e01b65144aa3bfbaa7c98a",
"sha256": "fb0a6d46d06d7d8bccb520db7e5c625b5e758c11d7e7a0401bfba32f38a2d500"
},
"downloads": -1,
"filename": "tunacode_cli-0.0.66-py3-none-any.whl",
"has_sig": false,
"md5_digest": "494c22ed02e01b65144aa3bfbaa7c98a",
"packagetype": "bdist_wheel",
"python_version": "py3",
"requires_python": "<3.14,>=3.10",
"size": 224633,
"upload_time": "2025-08-14T19:04:56",
"upload_time_iso_8601": "2025-08-14T19:04:56.043690Z",
"url": "https://files.pythonhosted.org/packages/68/f1/a1b8693fcd4e6ea019e59467087a11f22378db4f7b1225e68034a14df665/tunacode_cli-0.0.66-py3-none-any.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": null,
"digests": {
"blake2b_256": "2ad237d68bf1a26d40cd38deba3c36ad68a6489f9e007d405f966ff5d8810b82",
"md5": "c617c293d5baabeb3a609eea7f10dee1",
"sha256": "9f1997fd16419b3276fd9d87fde9cdaaef424eafea6f95b9985602c0d7070dfd"
},
"downloads": -1,
"filename": "tunacode_cli-0.0.66.tar.gz",
"has_sig": false,
"md5_digest": "c617c293d5baabeb3a609eea7f10dee1",
"packagetype": "sdist",
"python_version": "source",
"requires_python": "<3.14,>=3.10",
"size": 1298751,
"upload_time": "2025-08-14T19:04:57",
"upload_time_iso_8601": "2025-08-14T19:04:57.741988Z",
"url": "https://files.pythonhosted.org/packages/2a/d2/37d68bf1a26d40cd38deba3c36ad68a6489f9e007d405f966ff5d8810b82/tunacode_cli-0.0.66.tar.gz",
"yanked": false,
"yanked_reason": null
}
],
"upload_time": "2025-08-14 19:04:57",
"github": true,
"gitlab": false,
"bitbucket": false,
"codeberg": false,
"github_user": "alchemiststudiosDOTai",
"github_project": "tunacode#readme",
"travis_ci": false,
"coveralls": false,
"github_actions": true,
"lcname": "tunacode-cli"
}