songbird-ai

- **Name**: songbird-ai
- **Version**: 0.1.14.1
- **Summary**: A terminal-based AI coding companion that runs on local LLMs
- **Author**: Spandan Chavan
- **Requires Python**: >=3.10
- **Keywords**: ai, assistant, cli, coding, llm, terminal
- **Upload time**: 2025-07-14 10:40:40
- **Homepage**: https://github.com/Spandan7724/songbird
- **Repository**: https://github.com/Spandan7724/songbird.git
- **Issues**: https://github.com/Spandan7724/songbird/issues
# Songbird

<div align="center">

<pre style="color: #4A90E2;">
███████╗ ██████╗ ███╗   ██╗ ██████╗ ██████╗ ██╗██████╗ ██████╗ 
██╔════╝██╔═══██╗████╗  ██║██╔════╝ ██╔══██╗██║██╔══██╗██╔══██╗
███████╗██║   ██║██╔██╗ ██║██║  ███╗██████╔╝██║██████╔╝██║  ██║
╚════██║██║   ██║██║╚██╗██║██║   ██║██╔══██╗██║██╔══██╗██║  ██║
███████║╚██████╔╝██║ ╚████║╚██████╔╝██████╔╝██║██║  ██║██████╔╝
╚══════╝ ╚═════╝ ╚═╝  ╚═══╝ ╚═════╝ ╚═════╝ ╚═╝╚═╝  ╚═╝╚═════╝
</pre>

**A terminal-based AI coding companion with 11 professional tools, smart task management, and persistent memory**

[![CI](https://github.com/Spandan7724/songbird/workflows/CI/badge.svg)](https://github.com/Spandan7724/songbird/actions/workflows/ci.yml)
[![PyPI version](https://badge.fury.io/py/songbird-ai.svg)](https://badge.fury.io/py/songbird-ai)
[![Python 3.10+](https://img.shields.io/badge/python-3.10%2B-blue.svg)](https://www.python.org/downloads/)
[![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](https://opensource.org/licenses/MIT)


</div>


## Downloading

### With uv (recommended)
```bash
uv tool install songbird-ai
```
### With pipx (if available)
```bash
pipx install songbird-ai
```
### With pip (traditional)
```bash
pip install songbird-ai
```


## Quick Start



### Option 1: With Gemini (Recommended)

```bash
# Install Songbird
pipx install songbird-ai

# Get your free Gemini API key
# Visit: https://aistudio.google.com/app/apikey

# Set your API key
export GEMINI_API_KEY="your-api-key-here"

# Start coding with AI
songbird

# Set your default provider (one-time setup)
songbird --default

# Continue your previous session
songbird --continue

# Resume from any previous session
songbird --resume
```

### Option 2: With GitHub Copilot

```bash
# Install Songbird
pipx install songbird-ai

# Get your GitHub Copilot access token from VS Code or JetBrains IDE
# VS Code: Command Palette → "GitHub Copilot: Generate Access Token"
# JetBrains: Settings → Tools → GitHub Copilot → Generate Token

# Set your access token
export COPILOT_ACCESS_TOKEN="your-copilot-token-here"

# Start coding with GitHub Copilot
songbird --provider copilot

# Set Copilot as your default (one-time setup)
songbird default copilot

# Continue previous session with Copilot
songbird --provider copilot --continue
```

### Option 3: With Local Ollama

```bash
# Install Songbird
pipx install songbird-ai

# Install Ollama
curl -fsSL https://ollama.ai/install.sh | sh

# Start Ollama and pull a model
ollama serve
ollama pull devstral:latest

# Start coding with AI
songbird --provider ollama

# Set Ollama as your default (one-time setup)  
songbird default ollama

# Continue previous session with Ollama
songbird --provider ollama --continue
```

## Features


**11 Professional Tools** for complete development workflows:
- **Enhanced File Operations**: `file_search`, `file_read`, `file_create`, `file_edit` with syntax highlighting and diff previews
- **Smart Task Management**: `todo_read`, `todo_write` with automatic prioritization and session persistence
- **Advanced File Discovery**: `glob` pattern matching, `grep` regex search, enhanced `ls` directory listing
- **Atomic Operations**: `multi_edit` for safe bulk file changes with rollback capabilities
- **Shell Integration**: `shell_exec` with live output streaming and cross-platform support
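
In practice you rarely call these tools by name; you describe what you want and Songbird picks the appropriate tools itself. A few illustrative prompts, shown here in print mode for brevity (the file names are hypothetical):

```bash
# Illustrative one-off requests; Songbird selects the tools on its own
songbird -p "Read pyproject.toml and summarize the project's dependencies"   # file_read
songbird -p "Create a notes.md file with a short overview of this project"   # file_create
songbird -p "Run the test suite and show me any failures"                    # shell_exec
```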

### **Intelligent Task Management**
- **LLM-Powered Auto-Completion**: Automatically detects and completes tasks from natural language - just say "I implemented the JWT tokens" and the system intelligently marks related todos as complete
- **Session-Aware Todos**: Create, track, and complete development tasks with automatic priority assignment
- **Smart Prioritization**: AI analyzes task content to assign appropriate priority levels
- **Clean Visual Display**: Simple bullet points with strikethrough for completed tasks
- **Semantic Understanding**: The LLM understands context - "JWT token system" matches "JWT tokens for authentication"

**Smart Todo Management Example:**
```
# Create todos naturally
"I need to implement JWT authentication and user registration"
✓ Creates: "Implement JWT authentication" and "Add user registration"

# Complete todos intelligently  
"I finished the JWT token system and it's working"
✓ Auto-completes: "Implement JWT authentication" (semantic match)
✓ Shows updated list with strikethrough for completed items

# No manual marking needed - just describe what you did!
``` 

### **Advanced File Discovery & Search**
- **Glob Patterns**: Find files with patterns like `**/*.py`, `src/**/*.js`, `*test*.{py,js}`
- **Regex Content Search**: Powerful regex search with context lines and highlighting
- **Enhanced Directory Listing**: Rich formatted output with sorting and metadata
- **Smart File Detection**: Automatically detects filename vs content searches
- **Type-Specific Search**: Filter by file extensions (py, js, md, txt, json, yaml, etc.)
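
The same discovery features can be exercised with plain-language requests; the patterns and paths below are only examples:

```bash
# Illustrative prompts that exercise glob, grep, and the enhanced ls
songbird -p "List every file matching **/test_*.py"
songbird -p "Search src/**/*.js for TODO comments and show a few lines of context"
songbird -p "Show the contents of the current directory sorted by size"
```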

### **Atomic Multi-File Operations**
- **Bulk Editing**: Edit multiple files simultaneously with safety guarantees
- **Beautiful Previews**: Unified diff display for all changes before applying
- **Rollback Protection**: Automatic rollback if any operation fails
- **Atomic Transactions**: All-or-nothing approach ensures consistency
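
A typical bulk-edit request might look like the sketch below. The symbol names are hypothetical; as described above, Songbird shows a unified diff of every change before applying it and rolls back if any edit fails.

```bash
# Hypothetical rename across multiple files, handled as one atomic multi_edit
songbird -p "Rename the function load_config to read_config across src/ and update all call sites"
```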

### **Persistent Memory System**
- **Session Persistence**: Automatic conversation saving with project-aware storage
- **Seamless Continuation**: Resume exactly where you left off with `--continue`
- **Session Browser**: Interactive menu to select from previous sessions with `--resume`
- **Project Isolation**: Each git repository gets separate session storage
- **Visual Replay**: Perfect restoration of conversation history with tool outputs

### **Dynamic Command System**
- **In-Chat Commands**: Type `/` for instant command access without leaving conversation
- **Real-Time Model Switching**: Change models with `/model` command - no session restart needed
- **Model Persistence**: Model changes automatically save and restore across sessions
- **Help System**: Comprehensive `/help` command with examples and documentation
- **Session Management**: `/clear` command for conversation management

### **Multi-Provider AI Support**
- **6 AI Providers**: OpenAI, Anthropic Claude, Google Gemini, GitHub Copilot, Ollama, and OpenRouter
- **Automatic Provider Selection**: Intelligent fallback based on available API keys
- **Cloud & Local**: Use powerful cloud models or private local models
- **Dynamic Switching**: Switch models and providers instantly during conversations
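
A minimal sketch of the fallback behaviour described above: with a Gemini key exported, a bare `songbird` selects Gemini automatically; without cloud keys you can still point it at a local Ollama model.

```bash
# Cloud: Gemini is selected automatically when its key is present
export GEMINI_API_KEY="your-api-key-here"
songbird

# Local: no cloud keys needed, everything stays on your machine
songbird --provider ollama
```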

### **Safety & Security**
- **Repository Sandboxing**: Cannot access files outside your project
- **Diff Previews**: Review all changes before applying with beautiful unified diffs
- **Atomic Operations**: Safe multi-file editing with automatic rollback
- **Input Validation**: Comprehensive validation for all tool operations


## Installation

### Recommended: pipx (for CLI tools)

```bash
# Install with pipx (isolated, globally available)
pipx install songbird-ai

# Verify installation
songbird --help
```

### Alternative: uv (fast package manager)

```bash
# Install with uv
uv tool install songbird-ai

# Verify installation
songbird --help
```

### Traditional: pip

```bash
# Install with pip (may conflict with other packages)
pip install songbird-ai
```

## Getting Started

### 1. Install Ollama

<details>
<summary>Linux/WSL</summary>

```bash
curl -fsSL https://ollama.ai/install.sh | sh
```
</details>

<details>
<summary>macOS</summary>

```bash
# Using Homebrew
brew install ollama

# Or download from https://ollama.ai/download
```
</details>

<details>
<summary>Windows</summary>

Download and install from [https://ollama.ai/download](https://ollama.ai/download)
</details>

### 2. Start Ollama Server

```bash
ollama serve
```

### 3. Pull a Coding Model

```bash
# Recommended: Devstral (enhanced coding capabilities)
ollama pull devstral:latest

# Alternatives
ollama pull codellama:7b        # Meta's CodeLlama
ollama pull llama3.2:3b         # General purpose, faster
ollama pull deepseek-coder:6.7b # DeepSeek Coder
```

### 4. Start Songbird

```bash
# Launch interactive chat (uses Gemini if API key is set, otherwise Ollama)
songbird

# Use specific provider
songbird --provider gemini
songbird --provider ollama

# Switch models during the conversation with the in-chat /model command (typed in the chat, not the shell)
/model gemini-2.0-flash-001    # Switch models in conversation
/model qwen2.5-coder:7b        # No restart needed

# Check available providers
songbird --list-providers

# Check version and commands
songbird --help
songbird version
```

For detailed setup instructions, model comparisons, and troubleshooting for all providers, see the **[Complete Provider Guide](docs/providers.md)**.

## Usage Examples

```bash
# Basic chat session (auto-selects best provider)
songbird

# Use Gemini (powerful, cloud-based)
songbird --provider gemini

# Use GitHub Copilot (subscription-based)
songbird --provider copilot

# Use Ollama (private, local)
songbird --provider ollama

# List available providers
songbird --list-providers

# Set default provider (no need to specify --provider every time)
songbird --default               # Interactive menu to set defaults
songbird default gemini          # Set Gemini as default provider
songbird default copilot gpt-4o  # Set Copilot with specific model
songbird default ollama qwen2.5-coder:7b  # Set Ollama with specific model

# After setting defaults, just use:
songbird                 # Uses your configured defaults

# Session management
songbird --continue      # Continue latest session
songbird --resume        # Pick from previous sessions

# Print mode - One-off commands with clean output
songbird -p "What is 2+2?"                    # Shows tool execution + answer
songbird -p "Create a Python hello world"     # Transparent mode (default)
songbird -p "List files in current directory" # Shows tools used

# Ultra-quiet mode - Final answer only (perfect for scripting)
songbird -p "What is 2+2?" --quiet           # Only shows: "4"
songbird -p "Create hello.py" --quiet         # Only shows: "File created successfully..."
songbird -p "Current time" --quiet            # Clean output for automation

# Show available commands
songbird --help

# Display version
songbird version
```
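
Because `--quiet` prints only the final answer, print mode slots neatly into shell scripts. A minimal sketch (the script name and prompt are just examples; it assumes `songbird` is on `PATH` with a default provider configured):

```bash
#!/usr/bin/env bash
# summarize-repo.sh - hypothetical helper that captures Songbird's answer in a variable
set -euo pipefail

summary="$(songbird -p "Summarize what this repository does in one sentence" --quiet)"
echo "Repository summary: ${summary}"
```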

### In-Chat Commands

Once in a conversation, use these powerful commands:

```bash
# Model switching (no session restart needed!)
/model                    # See available models and switch interactively
/model devstral:latest    # Switch to specific model directly
/model gemini-2.0-flash-001  # Switch to Gemini model

# Help and information
/help                     # Show all available commands
/help model               # Get help for specific command
/                         # Quick command menu

# Session management
/clear                    # Clear conversation history
/clear --force            # Clear without confirmation
```

## Configuration

### Default Provider & Model Setup

Songbird lets you set default providers and models so you don't have to specify them every time:

```bash
# Interactive setup - shows menu to choose provider and model
songbird --default

# Direct setup - set provider only (uses provider's default model)
songbird default gemini
songbird default copilot  
songbird default ollama

# Direct setup - set both provider and specific model
songbird default openai gpt-4o-mini
songbird default claude claude-3-5-sonnet-20241022
songbird default gemini gemini-2.0-flash-001
songbird default ollama qwen2.5-coder:7b
songbird default copilot gpt-4.1-2025-04-14

# After setting defaults, simply run:
songbird  # Uses your configured provider and model
```

### Configuration File

Songbird stores your preferences in `~/.songbird/config.json`:

```json
{
  "llm": {
    "default_provider": "gemini",
    "default_models": {
      "openai": "gpt-4o",
      "claude": "claude-3-5-sonnet-20241022", 
      "gemini": "gemini-2.0-flash-001",
      "ollama": "qwen2.5-coder:7b",
      "copilot": "gpt-4.1-2025-04-14"
    }
  }
}
```

### Environment Variable Overrides

You can override any configuration with environment variables:

```bash
export SONGBIRD_DEFAULT_PROVIDER="gemini"
export SONGBIRD_MAX_TOKENS=8192
export SONGBIRD_TEMPERATURE=0.3
```



## Development

### Prerequisites

- Python 3.10 or higher
- [uv](https://docs.astral.sh/uv/) (recommended) or pip

### Setup

```bash
# Clone the repository
git clone https://github.com/Spandan7724/songbird
cd songbird

# Install with uv (recommended)
uv sync
uv pip install -e .

# Or with traditional tools
python -m venv .venv
source .venv/bin/activate  # Windows: .venv\Scripts\activate
pip install -e .
```

### Testing

```bash
# Run all tests
pytest

# Run with coverage
pytest --cov=songbird

# Run specific test file
pytest tests/test_cli.py -v

# Run LLM integration tests (requires Ollama)
pytest tests/llm/ -v
```

### Building

```bash
# Build wheel and source distribution
python -m build

# Test local installation
uv tool install ./dist/songbird_ai-*.whl
```



## Troubleshooting

<details>
<summary>Ollama Connection Issues</summary>

```bash
# Check if Ollama is running
curl http://localhost:11434/api/tags

# Restart Ollama service
ollama serve

# Check available models
ollama list
```
</details>

<details>
<summary>Model Not Found Errors</summary>

```bash
# Pull the required model
ollama pull qwen2.5-coder:7b

# List available models
ollama list
```
</details>

<details>
<summary>Gemini API Issues</summary>

```bash
# Check if API key is set
echo $GEMINI_API_KEY

# Get a free API key
# Visit: https://aistudio.google.com/app/apikey

# Set API key permanently
echo 'export GEMINI_API_KEY="your-key-here"' >> ~/.bashrc
source ~/.bashrc

# Test Gemini provider
songbird --provider gemini
```
</details>

<details>
<summary>GitHub Copilot Issues</summary>

```bash
# Check if access token is set
echo $COPILOT_ACCESS_TOKEN

# Get access token from VS Code
# Command Palette → "GitHub Copilot: Generate Access Token"

# Get access token from JetBrains IDE
# Settings → Tools → GitHub Copilot → Generate Token

# Set token temporarily
export COPILOT_ACCESS_TOKEN="ghu_xxxxxxxxxxxxxxxxxxxx"

# Set token permanently
echo 'export COPILOT_ACCESS_TOKEN="your-token-here"' >> ~/.bashrc
source ~/.bashrc

# Test Copilot provider
songbird --provider copilot

# Check available models
songbird --provider copilot --list-providers
```
</details>

<details>
<summary>Installation Issues</summary>

```bash
# Update pipx
pipx upgrade songbird-ai

# Or reinstall
pipx uninstall songbird-ai
pipx install songbird-ai
```
</details>

## License

This project is licensed under the MIT License - see the [LICENSE](LICENSE) file for details.

## Acknowledgments

- [Ollama](https://ollama.ai) - Local LLM runtime
- [Typer](https://typer.tiangolo.com) - CLI framework
- [Rich](https://rich.readthedocs.io) - Terminal formatting
- [ripgrep](https://github.com/BurntSushi/ripgrep) - Fast text search

---


            
