termaitrik

Name: termaitrik
Version: 1.5.1
Summary: Terminal AI assistant inspired by Warp AI, supporting Ollama and BYOK
Author / Maintainer: Trikketto
Requires Python: >=3.10
Keywords: ai, terminal, ollama, cli, assistant
Project: https://github.com/Trikketto/termai_project
Uploaded: 2025-09-02 21:57:50

# TermAI — AI assistant for your terminal (Ollama + BYOK)

> Version: 1.5.1

**TermAI** is a terminal AI assistant (CLI + local server) inspired by Warp AI.
It works *offline* with Ollama and supports **BYOK** (Bring Your Own Key) for cloud providers
(OpenAI or compatible endpoints).

## Features

- `termai chat ["initial message"]` — interactive general chat with conversation history (optional streaming)
- `termai suggest "...description..."` — generates **shell commands** with reasoning and risks
- `termai explain --cmd "command"` — explains what a command does
- `termai fix --cmd "command" --error "stderr"` — suggests a fix
- `termai run "...description..."` — suggests a command and asks for confirmation before **executing** it
- `termai agent "...goal..."` — experimental multi‑step iterative assistant (proposes each command and asks for confirmation)
- `termai install-shell [--shell SHELL]` — install shell alias
- `termai uninstall-shell` — uninstall shell alias
- Local FastAPI server: `uvicorn termai.server:app --host 127.0.0.1 --port 8765`
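
For example, two of these commands chain naturally in a debugging session (the flags are from the list above; the sample command and error text are illustrative):

```bash
# Ask what an unfamiliar command does, then request a fix when it fails.
termai explain --cmd "tar -czf backup.tar.gz ."
termai fix --cmd "tar -czf backup.tar.gz" --error "tar: Cowardly refusing to create an empty archive"
```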

## Requirements

- Python 3.10+
- (Optional) **Ollama** at `http://127.0.0.1:11434`
- (Optional) cloud provider key for BYOK (`OPENAI_API_KEY` etc.)
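
A quick way to check that the optional Ollama dependency is reachable (both commands are standard Ollama tooling, not part of TermAI):

```bash
ollama list                               # lists local models, if the Ollama CLI is installed
curl -s http://127.0.0.1:11434/api/tags   # the HTTP API answers with the installed models
```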

## Installation & Quick Start

Install from PyPI:

```bash
pip install termaitrik
termai --help
```

Or run without installing:

```bash
uvx --from termaitrik termai --help
```

For development (from source):

```bash
pip install -r requirements.txt
pip install -e .
termai --help
```

The shell integration alias (see below) provides resilient fallbacks that work across all install modes.

## Configuration

Create `~/.termai/config.yaml` (see `examples/config.example.yaml`).

Minimal example (Ollama):
```yaml
default_provider: ollama
model: llama3.1:8b
ollama:
  host: http://127.0.0.1:11434
```
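
The model named in the config must exist locally; with a stock Ollama install it can be pulled first:

```bash
ollama pull llama3.1:8b   # fetch the model referenced by the config above
```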

BYOK example (OpenAI):
```yaml
default_provider: openai
model: gpt-4o-mini
openai:
  api_key: sk-...
  base_url: https://api.openai.com/v1
```

> Useful environment variables: `TERMAI_PROVIDER`, `TERMAI_MODEL`,
> `OPENAI_API_KEY`, `OPENAI_BASE_URL`, `OLLAMA_HOST`.
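
For one-off overrides, the same settings can be passed inline instead of editing the config (variable names from the note above; the task text is illustrative):

```bash
TERMAI_PROVIDER=openai TERMAI_MODEL=gpt-4o-mini OPENAI_API_KEY=sk-... \
  termai suggest "find files larger than 1 GB"
```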

## Shell Integration (alias `ai`)

Install a resilient alias that falls back across install modes:

### Option 1: Using TermAI CLI (Recommended)

```bash
termai install-shell
# then reopen your shell (or: source ~/.bashrc | ~/.zshrc | fish config)
ai suggest "create a tar archive of the current folder"
```

Uninstall:
```bash
termai uninstall-shell
```

### Option 2: Using scripts directly

```bash
bash scripts/install-shell-integration.sh
# then reopen your shell (or: source ~/.bashrc | ~/.zshrc | fish config)
ai suggest "create a tar archive of the current folder"
```

Uninstall:
```bash
bash scripts/uninstall-shell-integration.sh
```

### Shell Integration Details

The alias resolution order:
1. Global/venv command `termai`
2. Importable module: `python -m termai.cli`
3. PyPI package via `uvx termai`
4. Local repository via `uvx --from <repo>` (development)
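
A minimal sketch of a wrapper implementing that order (hypothetical; the actual alias written by `termai install-shell` may differ in detail):

```bash
# Hypothetical fallback wrapper mirroring the resolution order above.
ai() {
    if command -v termai >/dev/null 2>&1; then
        termai "$@"                         # 1. global/venv command
    elif python3 -c "import termai" 2>/dev/null; then
        python3 -m termai.cli "$@"          # 2. importable module
    elif command -v uvx >/dev/null 2>&1; then
        uvx --from termaitrik termai "$@"   # 3. PyPI package via uvx
    else
        echo "ai: termai not found" >&2     # 4. would need a local repo path
        return 127
    fi
}
```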

## Local Server API

Start:
```bash
uvicorn termai.server:app --host 127.0.0.1 --port 8765
```

Endpoints:
- `GET /health` → `{ "ok": true }`
- `POST /v1/chat` → body:
  ```json
  {"messages":[{"role":"user","content":"hi"}],"provider":"ollama","model":"llama3.1:8b"}
  ```
  Response: `{ "content": "..." }`
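
Both endpoints can be exercised with curl once the server is running (the request body is copied from above):

```bash
curl -s http://127.0.0.1:8765/health
curl -s http://127.0.0.1:8765/v1/chat \
  -H "Content-Type: application/json" \
  -d '{"messages":[{"role":"user","content":"hi"}],"provider":"ollama","model":"llama3.1:8b"}'
```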

## Safety & Warnings

- Always inspect and confirm suggested commands before executing.
- Redact secrets before sending errors or stack traces to providers.
- Local models (Ollama) keep data on your machine; cloud providers transmit prompts off‑device.
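
One way to follow the redaction advice before handing stderr to a cloud provider (a minimal sketch; the `sk-` pattern and the failing command are illustrative):

```bash
# Mask anything that looks like an API key before passing the error along.
err=$(make build 2>&1 | sed -E 's/sk-[A-Za-z0-9_-]+/sk-REDACTED/g')
termai fix --cmd "make build" --error "$err"
```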

## Changelog Highlights (1.0.0)

- Added robust shell integration with multi-fallback launcher.
- Stabilized command set (`chat`, `suggest`, `explain`, `fix`, `run`, plus info/examples helpers).
- Improved provider error handling & streaming.
- Config merging & environment variable expansion.
- Extended test coverage across core flows.

MIT License.

## Experimental Agent Mode

You can try an early iterative "agent" loop that plans and executes several shell commands with your confirmation between steps:

```bash
termai agent "list the latest 5 created files, then show the first"
```

Workflow per step:
1. Model emits JSON: `{ "thought", "command", "explanation", "done" }`.
2. You confirm/modify/abort; if accepted the command runs locally.
3. Stdout/stderr are summarized and appended to the conversation as an observation.
4. Loop continues until `done=true`, command empty, or max steps reached (default 6).
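
An illustrative step object for the example goal above (field names from step 1; the values are hypothetical):

```json
{
  "thought": "List the five most recently modified files first.",
  "command": "ls -t | head -n 5",
  "explanation": "Sorts entries by modification time and keeps the first five.",
  "done": false
}
```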

Options (current minimal POC):
```
--steps N          maximum steps (default 6)
--dry-run          never execute commands (records hypothetical observations)
--model / -m       override configured model
--temperature / -t sampling temperature (default 0.1)
```
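
Putting the options together for a safe first try (flags from the list above; the goal text is illustrative):

```bash
termai agent --steps 8 --dry-run -m llama3.1:8b "summarize disk usage under /var"
```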

Roadmap ideas (not yet implemented): whitelist & yolo auto‑approval modes, danger pattern guard, transcript export, richer tool schema. Feedback welcome.

            
