vity 0.2.4

Summary: AI-powered terminal assistant for generating shell commands
Requires Python: >=3.9
License: MIT
Keywords: ai, assistant, cli, shell, terminal
Uploaded: 2025-07-16 16:23:30

# 🤖 Vity - AI Terminal Assistant

![demo_video](https://github.com/Kaleab-Ayenew/demo_vids/blob/main/loom_720p-_online-video-cutter.com_-_1_.gif)

AI-powered terminal assistant that generates shell commands and provides coding help. Works with OpenAI, Google Gemini, local models (Ollama), and any OpenAI-compatible API.

## ✨ Features

- **🎯 Smart Command Generation**: Describe tasks, get exact commands
- **🤖 Multi-Provider Support**: OpenAI, Google Gemini, Ollama, or any OpenAI-compatible API
- **🏠 Local Model Support**: Run completely offline with Ollama
- **🧠 Context Awareness**: Record terminal sessions for better responses
- **💬 Chat Mode**: Ask questions about errors and commands
- **📹 Session Recording**: Capture terminal output for contextual help

## 🚀 Quick Start

### Install
```bash
curl -LsSf https://raw.githubusercontent.com/kaleab-ayenew/vity/main/install.sh | sh
```
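
If you'd rather not pipe a script into your shell, the package is also published on PyPI, so an isolated install should work as well (an alternative path; the uninstall steps further below assume a pipx or pip install):

```bash
# Alternative: install from PyPI into an isolated environment
pipx install vity

# Or into the current Python environment
pip install vity
```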

### Configure
```bash
vity config
```
You'll be prompted for:
- **Base URL**: Your LLM provider endpoint
- **API Key**: Your API key (use `NONE` for local models)
- **Model**: Model name to use
- **History Limit**: Lines of terminal history to send (default: 1000)
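
As a rough illustration, an OpenAI setup might look like the run below (the exact prompt wording and defaults may differ between versions):

```
$ vity config
Base URL: https://api.openai.com/v1
API Key: sk-your-openai-key
Model: gpt-4o-mini
History Limit [1000]: 1000
```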

### Use
```bash
# Generate commands
vity do "find all python files larger than 1MB"
vity do "kill process using port 3000"

# Chat with AI
vity chat "explain this error message"
vity chat "what does chmod 755 do?"

# Use with context
vity record    # Start recording session
# ... work normally ...
vity do "fix this error"  # AI sees your terminal history
exit          # Stop recording
```

## 🔧 Provider Configuration Examples

### OpenAI
```
Base URL: https://api.openai.com/v1
API Key: sk-your-openai-key
Model: gpt-4o-mini
```

### Google Gemini
```
Base URL: https://generativelanguage.googleapis.com/v1beta
API Key: your-gemini-key
Model: gemini-1.5-flash
```

### Ollama (Local)
```
Base URL: http://localhost:11434/v1
API Key: NONE
Model: llama3.2:3b
```
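
If you haven't set up Ollama yet, a typical local workflow looks roughly like this (the model name matches the example above; any model you have pulled will work):

```bash
# Pull the model and make sure the Ollama server is running
ollama pull llama3.2:3b
ollama serve   # not needed if Ollama already runs as a background service

# Sanity-check the OpenAI-compatible endpoint before configuring vity
curl http://localhost:11434/v1/models
```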

### Other Providers
Works with any OpenAI-compatible API (Anthropic, Together AI, etc.)
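
For example, a Together AI setup follows the same pattern (the base URL below is Together's OpenAI-compatible endpoint at the time of writing; check your provider's docs for the exact value and available model IDs):

```
Base URL: https://api.together.xyz/v1
API Key: your-together-key
Model: <model-id-from-your-provider>
```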

## 📋 Requirements

- **Python**: 3.9+
- **OS**: Linux or macOS
- **LLM Provider**: OpenAI, Gemini, Ollama, or compatible API

## 🎯 Commands

| Command | Description |
|---------|-------------|
| `vity do "<task>"` | Generate shell command |
| `vity chat "<question>"` | Chat with AI |
| `vity record` | Start recording session |
| `vity status` | Show recording status |
| `vity config` | Manage configuration |
| `vity config --reset` | Reset configuration |
| `vity install` | Install shell integration |
| `vity reinstall` | Reinstall shell integration |
| `vity uninstall` | Completely remove vity |

## 🔄 Context Recording

For the best experience, use recording to give Vity context:

```bash
vity record          # Start recording
# ... work normally, encounter errors ...
vity do "fix this"   # AI sees your terminal history and errors
exit                 # Stop recording
```

## 🗑️ Uninstalling

```bash
# Remove everything (shell integration, config, logs, chat history)
vity uninstall

# Or force without confirmation
vity uninstall --force
```

Then remove the package:
```bash
pipx uninstall vity  # if installed with pipx
# or
pip uninstall vity   # if installed with pip
```

## 🛠️ Troubleshooting

**Command not found**: Restart your terminal or run `source ~/.bashrc`

**API errors**: Check your configuration with `vity config --show`

**Reset everything**: `vity config --reset`
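
For example, a quick check of the first two points (assuming a bash shell):

```bash
# Is vity on the PATH? If not, reload the shell configuration.
command -v vity || source ~/.bashrc

# Show the current configuration
vity config --show
```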

## 🔒 Privacy

- Configuration is stored locally in `~/.config/vity/`
- Terminal history is only sent during recording or with the `-f` flag
- No data is stored on external servers (beyond the API calls to your configured provider)
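
To see what is kept on disk (the exact files inside may vary by version):

```bash
ls -la ~/.config/vity/
```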

## 📄 License

MIT License - see [LICENSE](LICENSE) file.

---

**Need help?** Run `vity help` or open an issue on [GitHub](https://github.com/kaleab-ayenew/vity/issues).

            
