# Termax
![](https://github.com/huangyz0918/termax/actions/workflows/lint.yml/badge.svg) ![](https://github.com/huangyz0918/termax/actions/workflows/test.yml/badge.svg) [![Downloads](https://static.pepy.tech/badge/termax)](https://pepy.tech/project/termax) ![PyPI - Version](https://img.shields.io/pypi/v/termax)
Similar to GitHub [Copilot CLI](https://docs.github.com/en/copilot/github-copilot-in-the-cli/about-github-copilot-in-the-cli), Termax is a personal AI assistant in your terminal.
<br/>
<p align="center"> <img src="docs/icon_text.svg" alt="Termax logo" width=300> </p>
Key features:
- 🍼 Personalized Experience: optimizes command generation with RAG.
- 📐 Broad LLM Support: OpenAI GPT, Anthropic Claude, Google Gemini, Mistral AI, Ollama, and more.
- 🧩 Shell Extensions: plugs into popular shells such as `zsh`, `bash`, and `fish`.
- 🕹 Cross-Platform: runs on Windows, macOS, and Linux.
## Installation
```bash
pip install termax
```
## Quick Start
> [!TIP]
> * After installation, you'll need to configure the LLM (e.g., set the [OpenAI API key](https://beta.openai.com/account/api-keys)).
> * A setup guide will automatically launch the first time you use Termax.
> * Alternatively, you can manually initiate configuration at any time by running `t config` or `termax config`.
> * Please read the ℹ️ notes and warnings under each feature before use to keep your data safe.
### Ask Commands
You can start using Termax by asking with the `t` or `termax` command, for example:
```bash
t show me the top-5 CPU processes
```
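For illustration, here is one command Termax might generate for that request. The exact output varies by model and platform; the `--sort` option assumes GNU/Linux `ps` from procps:

```shell
# Show the 5 most CPU-hungry processes (one header line + 5 rows).
ps aux --sort=-%cpu | head -n 6
```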
You can also use Termax to control your software:
```bash
t play a song using Spotify
```
Here is a more complex example:
![](docs/ask_cmd.gif)
ℹ️ **Note:** Be aware of the privacy implications. This feature includes the following information in the LLM prompt:
- **System Info:** OS, hardware architecture
- **Path Info:** username, current directory, file names in the directory
### Guess Commands (experimental)
Termax can predict your next move based on your command history, workspace information, and more. Just run `t guess` or `termax guess` to generate a suggested command. It's not only smart, it's fun!
```bash
t guess
```
Here is an example of Termax guessing my next move:
![](docs/guess.gif)
> [!WARNING]
>
> This feature automatically inserts the last 15 commands from your shell history into the LLM prompt. If your command history contains sensitive information, please refrain from using this feature to ensure your data remains secure.
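To make the warning concrete, here is a minimal sketch of the history slice involved. The function and the synthetic data are illustrative only, not Termax's actual implementation:

```python
def recent_history(lines, n=15):
    """Return the last n shell-history entries (the slice a prompt would include)."""
    return lines[-n:]

# Synthetic history standing in for a real ~/.bash_history or ~/.zsh_history.
fake_history = [f"command-{i}" for i in range(1, 31)]  # 30 fake entries

recent = recent_history(fake_history)
print(len(recent))   # 15
print(recent[-1])    # command-30
```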
## Shell Plugin
We support shells such as `bash`, `zsh`, and `fish`. You can install a plugin with:
```bash
t install -n <plugin>
```
The `<plugin>` can be any of `zsh`, `bash`, or `fish`. With a plugin installed, you can convert natural language directly into commands using the `Ctrl + K` shortcut.
![](docs/plugin.gif)
You can uninstall a plugin just as easily:
```bash
t uninstall -n <plugin>
```
Remember to source your shell configuration or restart the shell after installing or uninstalling plugins so the changes take effect.
## Configuration
Termax has a global configuration file that you can customize by editing it. Below is an example of setting up Termax with OpenAI:
```ini
[general] # general configuration
platform = openai # default platform
auto_execute = False # execute the generated commands automatically
show_command = True # show the generated command
storage_size = 2000 # the command history's size, default is 2000
[openai] # platform-related configuration
model = gpt-3.5-turbo # LLM model
api_key = <your API key> # API key
temperature = 0.7
save = False
```
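The file uses INI syntax with inline `#` comments. As a sketch, Python's `configparser` can read this format when inline comment prefixes are enabled; the text below mirrors the example above, with a placeholder API key:

```python
import configparser

# Literal config text mirroring the README example (the key is a placeholder).
raw = """
[general]
platform = openai        # default platform
auto_execute = False     # execute the generated commands automatically
show_command = True      # show the generated command
storage_size = 2000      # the command history's size

[openai]
model = gpt-3.5-turbo    # LLM model
api_key = sk-placeholder # API key
temperature = 0.7
save = False
"""

# inline_comment_prefixes lets configparser strip the trailing "# ..." comments.
cfg = configparser.ConfigParser(inline_comment_prefixes=("#",))
cfg.read_string(raw)

print(cfg["general"]["platform"])                  # openai
print(cfg.getboolean("general", "auto_execute"))   # False
print(cfg.getfloat("openai", "temperature"))       # 0.7
```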
> [!TIP]
> * The configuration file is stored at `<HOME>/.termax`, as is the vector database.
> * For LLMs other than OpenAI, you need to install the corresponding client library manually.
> * We use [ChromaDB](https://www.trychroma.com) as the vector database. When using OpenAI, Termax computes embeddings with OpenAI's `text-embedding-ada-002`; otherwise it defaults to Chroma's built-in embedding model.
## Retrieval-Augmented Generation (RAG)
Our system uses a straightforward Retrieval-Augmented Generation (RAG) approach to continuously improve the user experience. Each time the command generation feature is used, Termax captures and stores a "successful example" in a local vector database. A "successful example" comprises a command that executed without errors, together with the user description that prompted it. These entries serve as references for future command generation from similar descriptions.
<br/>
<p align="center"> <img src="docs/rag.svg" alt="RAG workflow" width=400> </p>
Additionally, we gather external information that is crucial for effective prompt engineering. This includes system details such as the operating system version and the file structure of the current workspace. This data is essential for generating commands that are compatible with the user's environment and relevant to file management operations.
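The retrieve step above can be sketched as a nearest-neighbor lookup over stored (description, command) pairs. The character-trigram "embedding" below is only a toy stand-in for the real embeddings and ChromaDB store:

```python
import math

def embed(text):
    """Toy embedding: counts of lowercase character trigrams."""
    vec = {}
    for i in range(len(text) - 2):
        gram = text[i:i + 3].lower()
        vec[gram] = vec.get(gram, 0) + 1
    return vec

def cosine(a, b):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(v * b.get(g, 0) for g, v in a.items())
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# Stored "successful examples": (user description, command that ran cleanly).
history = [
    ("show top cpu processes", "ps aux --sort=-%cpu | head -6"),
    ("list files by size", "ls -lS"),
]

def retrieve(query):
    """Return the stored example whose description is most similar to the query."""
    q = embed(query)
    return max(history, key=lambda ex: cosine(q, embed(ex[0])))

print(retrieve("show me the top-5 CPU processes")[1])  # ps aux --sort=-%cpu | head -6
```

The retrieved example is then injected into the prompt as a reference, which is what steers generation toward commands that worked before.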
## Contributing
Developers can install from source to get the latest features and bug fixes.
```bash
cd <root of this project>
pip install -e .
```
We use [PEP 8](https://peps.python.org/pep-0008/) as our coding standard; please read and follow it to avoid CI errors.
## License
Licensed under the [Apache License, Version 2.0](LICENSE).