terminal-buddy

Name: terminal-buddy
Version: 0.1.2
Summary: A terminal assistant powered by On-Device LLM
Author: Atharva Choudhary
Requires-Python: <4.0,>=3.12
Upload time: 2025-08-31 00:18:57
# TBuddy - Terminal Assistant Powered by On-Device LLM

[![Python 3.12+](https://img.shields.io/badge/python-3.12+-blue.svg)](https://www.python.org/downloads/)
[![Poetry](https://img.shields.io/badge/poetry-managed-orange.svg)](https://python-poetry.org/)
[![License](https://img.shields.io/badge/license-GPL%20v3-blue.svg)](LICENSE)

TBuddy is an intelligent terminal assistant that converts natural-language queries into bash commands using on-device Large Language Models (LLMs) of under one billion parameters. It provides both a command-line interface and a daemon service for seamless terminal command generation.

## 🚀 Features

- **Natural Language to Bash Commands**: Convert plain-English descriptions into executable bash commands
- **On-Device LLM Integration**: Uses Ollama with very small (sub-1B-parameter) local models to balance privacy, speed, memory usage, and accuracy
- **Semantic Example Selection**: Leverages vector embeddings to find relevant command examples from a pre-curated list
- **Dual Operation Modes**:
  - One-off command generation
  - Background daemon service for persistent, lower-latency availability
- **Rich Example Database**: Comprehensive collection of text-to-command examples (shipped with the repo)
- **Safe Command Generation**: Focuses on standard, safe, and secure bash commands

## 🏗️ Architecture

### Core Components

```
terminal-buddy/
├── src/terminal_buddy/
│   ├── main.py                      # CLI interface and server logic
│   └── utils/
│       ├── llm_functions.py         # LLM integration with Ollama
│       ├── config.py                # Configuration management
│       ├── prompts.py               # System prompts and templates
│       └── example_selection.py     # Vector-based example selection
├── data/examples/
│   └── text_2_command_examples.json # Text-to-command examples (included in the repo)
└── tests/                           # Test suite
```
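The selected examples and system prompts are combined into a few-shot prompt for the model. As a rough sketch of that pattern (the helper and template names here are hypothetical; the project's real templates live in `prompts.py` and `llm_functions.py`):

```python
# Hypothetical few-shot prompt template; the real system prompt lives in prompts.py.
SYSTEM = "You are a terminal assistant. Reply with a single safe bash command."

def build_prompt(query: str, examples: list[dict]) -> str:
    """Assemble a few-shot prompt from (text, command) example pairs."""
    shots = "\n".join(f"Q: {ex['text']}\nA: {ex['command']}" for ex in examples)
    return f"{SYSTEM}\n\n{shots}\n\nQ: {query}\nA:"

examples = [
    {"text": "list all files in current directory", "command": "ls -la"},
    {"text": "show disk usage", "command": "df -h"},
]
prompt = build_prompt("count lines in main.py", examples)
```

The few-shot pairs anchor the small model's output format, which matters most for sub-1B-parameter models.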

### Key Technologies

- **Ollama**: Local LLM inference engine
- **LangChain**: Vector embeddings and example selection
- **ChromaDB**: Vector database for semantic search
- **Typer**: Modern CLI framework
- **Pydantic**: Configuration and data validation
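In the real pipeline, LangChain and ChromaDB handle embedding and nearest-neighbour lookup over `nomic-embed-text` vectors. As a dependency-free illustration of the underlying idea only (not the project's actual code), here is a toy selector that ranks examples by bag-of-words cosine similarity:

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    # Toy "embedding": a bag-of-words count vector. The real system uses
    # nomic-embed-text via Ollama and stores vectors in ChromaDB.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def select_examples(query: str, examples: list[dict], k: int = 2) -> list[dict]:
    """Return the k examples whose text is most similar to the query."""
    q = embed(query)
    ranked = sorted(examples, key=lambda ex: cosine(q, embed(ex["text"])), reverse=True)
    return ranked[:k]

examples = [
    {"text": "list all files in current directory", "command": "ls -la"},
    {"text": "find large files on disk", "command": "du -ah . | sort -rh | head"},
    {"text": "show running processes", "command": "ps aux"},
]
best = select_examples("list files in a directory", examples, k=1)
```

Swapping the toy `embed` for a real embedding model is what turns this keyword match into semantic search.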

## 📋 Prerequisites

- Python 3.12 or higher
- [Ollama](https://ollama.ai/) installed and running
- Required Ollama models:
  - `qwen3:0.6b` (for command generation)
  - `nomic-embed-text` (for embeddings)

## 🛠️ Installation

### Step 1 - Install Package

#### Using pip installer (Recommended)

```bash
pip install terminal-buddy
```

#### Using Poetry (Build Yourself)

```bash
# Clone the repository
git clone <repository-url>
cd terminal-buddy

# Install dependencies
poetry install

# Install with development dependencies
poetry install --with dev
```

### Step 2 - Set Up Ollama Models
```bash
# Pull required models
ollama pull qwen3:0.6b
ollama pull nomic-embed-text
```

## 🚀 Usage

TBuddy provides a CLI with commands grouped under `query`, `server`, and `config`.

### Query Commands

#### Generate a command directly

```bash
tb query "list all files in current directory"
```

### Server Management

```bash
# Start the server in background mode (default)
tb server up

# Start the server in foreground mode
tb server up --no-daemonize

# Stop the server
tb server down

# Check server status
tb server status
```
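A background daemon of this kind is typically tracked through a PID file so that `status` and `down` can find the running process. A minimal sketch of that pattern (hypothetical file location; not TBuddy's actual implementation):

```python
import os
from pathlib import Path

PID_FILE = Path("/tmp/tbuddy.pid")  # hypothetical location

def server_status(pid_file: Path = PID_FILE) -> str:
    """Report whether the daemon recorded in pid_file is still alive."""
    if not pid_file.exists():
        return "stopped"
    pid = int(pid_file.read_text().strip())
    try:
        os.kill(pid, 0)  # signal 0 checks existence without sending anything
    except ProcessLookupError:
        return "stopped (stale pid file)"
    return f"running (pid {pid})"
```

Keeping the daemon resident avoids reloading the model and vector store on every query, which is where the faster availability comes from.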
### Configuration Management
```bash
# Show current configuration
tb config show

# Update LLM model
tb config set-llm-model qwen3:0.6b

# Update embeddings model
tb config set-embeddings-model nomic-embed-text

# Update examples file path
tb config set-examples-path ./data/examples/text_2_command_examples.json
```
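Under the hood, the configuration is validated with Pydantic. As a dependency-free sketch of the shape such a config plausibly takes (field names are inferred from the `tb config` subcommands above; the real model in `config.py` may differ):

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class TBuddyConfig:
    # Field names inferred from the tb config subcommands; hypothetical defaults.
    llm_model: str = "qwen3:0.6b"
    embeddings_model: str = "nomic-embed-text"
    examples_path: str = "./data/examples/text_2_command_examples.json"

    def save(self, path: str) -> None:
        """Persist the config as JSON."""
        with open(path, "w") as f:
            json.dump(asdict(self), f, indent=2)

    @classmethod
    def load(cls, path: str) -> "TBuddyConfig":
        """Load and validate a config from JSON."""
        with open(path) as f:
            return cls(**json.load(f))
```

Each `tb config set-*` command would then amount to loading the file, updating one field, and saving it back.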
            
