| Name | shell-ai |
| --- | --- |
| Version | 0.4.4 |
| Summary | None |
| home_page | None |
| author | Rick Lamers |
| maintainer | None |
| docs_url | None |
| requires_python | None |
| license | None |
| keywords | |
| upload_time | 2025-08-27 12:36:26 |
| requirements | No requirements were recorded. |
| Travis-CI | No Travis. |
| coveralls test coverage | No coveralls. |
# Shell-AI: let AI write your shell commands
[PyPI](https://pypi.org/project/shell-ai/) · [License: MIT](https://opensource.org/licenses/MIT) · [GitHub forks](https://github.com/ricklamers/shell-ai/network) · [GitHub stars](https://github.com/ricklamers/shell-ai/stargazers)
Shell-AI (`shai`) is a CLI utility that brings the power of natural language understanding to your command line. Simply describe what you want to do in natural language, and `shai` will suggest single-line commands that achieve your intent. Under the hood, Shell-AI uses [LangChain](https://github.com/langchain-ai/langchain) for LLM access and builds on the excellent [InquirerPy](https://github.com/kazhala/InquirerPy) for its interactive CLI.

## Installation
You can install Shell-AI directly from PyPI using pip:
```bash
pip install shell-ai
```
Note that on Linux, Python 3.10 or later is required.
After installation, you can invoke the utility using the `shai` command.
## Usage
To use Shell-AI, open your terminal and type:
```bash
shai run terraform dry run thingy
```
Shell-AI will then suggest three commands (configurable via `SHAI_SUGGESTION_COUNT`) to fulfill your request, for example:
- `terraform plan`
- `terraform plan -input=false`
- `terraform plan`
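Query phrasing is free-form, so any natural-language description works. For instance (this query is just an illustration; the suggested commands will vary by model):
```bash
shai find all files larger than 100MB in this directory
```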
## Features
- **Natural Language Input**: Describe what you want to do in plain English (or other supported languages).
- **Command Suggestions**: Get single-line command suggestions that accomplish what you asked for.
- **Cross-Platform**: Works on Linux, macOS, and Windows.
- **Azure Compatibility**: Shell-AI now supports Azure OpenAI deployments.
## Configuration
Shell-AI can be configured through environment variables or a config file located at `~/.config/shell-ai/config.json` (Linux/macOS) or `%APPDATA%\shell-ai\config.json` (Windows).
### Environment Variables
- `OPENAI_API_KEY`: (Required) Your OpenAI API key; leave it empty if you use Ollama
- `OPENAI_MODEL`: The OpenAI model to use (default: "gpt-3.5-turbo")
- `OPENAI_API_BASE`: The OpenAI or OpenAI-compatible API endpoint to use (default: None)
- `GROQ_API_KEY`: (Required if using Groq) Your Groq API key
- `SHAI_SUGGESTION_COUNT`: Number of suggestions to generate (default: 3)
- `SHAI_SKIP_CONFIRM`: Skip command confirmation when set to "true"
- `SHAI_SKIP_HISTORY`: Skip writing to shell history when set to "true"
- `SHAI_API_PROVIDER`: Choose between "openai", "ollama", "azure", or "groq" (default: "groq")
- `SHAI_TEMPERATURE`: Controls randomness in the output (default: 0.05). Lower values (e.g., 0.05) make output more focused and deterministic, while higher values (e.g., 0.7) make it more creative and varied.
- `CTX`: Enable context mode when set to "true" (Note: outputs will be sent to the API)
- `OLLAMA_MODEL`: The Ollama model to use (default: "phi3.5")
- `OLLAMA_API_BASE`: The Ollama endpoint to use (default: "http://localhost:11434/v1/")
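For example, a minimal OpenAI setup via environment variables could look like the following (the key value is a placeholder):
```bash
export SHAI_API_PROVIDER=openai
export OPENAI_API_KEY=your_openai_api_key_here
export OPENAI_MODEL=gpt-3.5-turbo
export SHAI_SUGGESTION_COUNT=3
```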
### Config File Example
```json
{
  "OPENAI_API_KEY": "your_openai_api_key_here",
  "OPENAI_MODEL": "gpt-3.5-turbo",
  "SHAI_SUGGESTION_COUNT": "3",
  "CTX": true
}
```
### Config Example for an OpenAI-Compatible API
```json
{
  "SHAI_API_PROVIDER": "openai",
  "OPENAI_API_KEY": "deepseek_api_key",
  "OPENAI_API_BASE": "https://api.deepseek.com",
  "OPENAI_MODEL": "deepseek-chat",
  "SHAI_SUGGESTION_COUNT": "3",
  "CTX": true
}
```
### Config Example for MistralAI
```json
{
  "SHAI_API_PROVIDER": "mistral",
  "MISTRAL_API_KEY": "mistral_api_key",
  "MISTRAL_API_BASE": "https://api.mistral.ai/v1",
  "MISTRAL_MODEL": "codestral-2508",
  "SHAI_SUGGESTION_COUNT": "3",
  "CTX": true
}
```
### Config Example for Ollama
```json
{
  "OPENAI_API_KEY": "",
  "SHAI_SUGGESTION_COUNT": "3",
  "SHAI_API_PROVIDER": "ollama",
  "OLLAMA_MODEL": "phi3.5",
  "OLLAMA_API_BASE": "http://localhost:11434/v1/",
  "SHAI_TEMPERATURE": "0.05"
}
```
If this config file exists, the application reads it and its values override any environment variables that are already set.
Run the application after setting these configurations.
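A hypothetical illustration of this precedence (the model names and query below are placeholders, not the tool's defaults):
```bash
# The environment asks for one model...
export SHAI_API_PROVIDER=openai
export OPENAI_MODEL=gpt-4o
# ...but if ~/.config/shell-ai/config.json sets "OPENAI_MODEL": "gpt-3.5-turbo",
# the config file wins and gpt-3.5-turbo is used for the request.
shai show the current git branch
```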
### Using with Groq
To use Shell AI with Groq:
1. Get your API key from Groq
2. Set the following environment variables:
```bash
export SHAI_API_PROVIDER=groq
export GROQ_API_KEY=your_api_key_here
export GROQ_MODEL=llama-3.3-70b-versatile
```
## Contributing
This implementation can be made much smarter! Contribute your ideas as pull requests and make Shell-AI better for everyone.
Contributions are welcome! Please read [CONTRIBUTING.md](CONTRIBUTING.md) for guidelines.
## License
Shell-AI is licensed under the MIT License. See [LICENSE](LICENSE) for details.
Raw data

```json
{
    "_id": null,
    "home_page": null,
    "name": "shell-ai",
    "maintainer": null,
    "docs_url": null,
    "requires_python": null,
    "maintainer_email": null,
    "keywords": null,
    "author": "Rick Lamers",
    "author_email": null,
    "download_url": "https://files.pythonhosted.org/packages/5f/a3/9e5a19d1c47a535723db2fd9ecbcb1532a257583ff28304ddc8fdc345b7e/shell_ai-0.4.4.tar.gz",
    "platform": null,
"description": "# Shell-AI: let AI write your shell commands\n\n[](https://pypi.org/project/shell-ai/)\n[](https://opensource.org/licenses/MIT)\n[](https://github.com/ricklamers/shell-ai/network)\n[](https://github.com/ricklamers/shell-ai/stargazers)\n\nShell-AI (`shai`) is a CLI utility that brings the power of natural language understanding to your command line. Simply input what you want to do in natural language, and `shai` will suggest single-line commands that achieve your intent. Under the hood, Shell-AI leverages the [LangChain](https://github.com/langchain-ai/langchain) for LLM use and builds on the excellent [InquirerPy](https://github.com/kazhala/InquirerPy) for the interactive CLI.\n\n\n\n## Installation\n\nYou can install Shell-AI directly from PyPI using pip:\n\n```bash\npip install shell-ai\n```\n\nNote that on Linux, Python 3.10 or later is required.\n\nAfter installation, you can invoke the utility using the `shai` command.\n\n## Usage\n\nTo use Shell-AI, open your terminal and type:\n\n```bash\nshai run terraform dry run thingy\n```\n\nShell-AI will then suggest 3 commands to fulfill your request:\n- `terraform plan`\n- `terraform plan -input=false`\n- `terraform plan`\n\n## Features\n\n- **Natural Language Input**: Describe what you want to do in plain English (or other supported languages).\n- **Command Suggestions**: Get single-line command suggestions that accomplish what you asked for.\n- **Cross-Platform**: Works on Linux, macOS, and Windows.\n- **Azure Compatibility**: Shell-AI now supports Azure OpenAI deployments.\n\n## Configuration\n\nShell-AI can be configured through environment variables or a config file located at `~/.config/shell-ai/config.json` (Linux/MacOS) or `%APPDATA%\\shell-ai\\config.json` (Windows).\n\n### Environment Variables\n\n- `OPENAI_API_KEY`: (Required) Your OpenAI API key, leave empty if you use ollama\n- `OPENAI_MODEL`: The OpenAI model to use (default: \"gpt-3.5-turbo\")\n- `OPENAI_API_BASE`: The OpenAI API / OpenAI compatible API endpoint to use (default: None)\n- `GROQ_API_KEY`: (Required if using Groq) Your Groq API key\n- `SHAI_SUGGESTION_COUNT`: Number of suggestions to generate (default: 3)\n- `SHAI_SKIP_CONFIRM`: Skip command confirmation when set to \"true\"\n- `SHAI_SKIP_HISTORY`: Skip writing to shell history when set to \"true\"\n- `SHAI_API_PROVIDER`: Choose between \"openai\", \"ollama\", \"azure\", or \"groq\" (default: \"groq\")\n- `SHAI_TEMPERATURE`: Controls randomness in the output (default: 0.05). 
Lower values (e.g., 0.05) make output more focused and deterministic, while higher values (e.g., 0.7) make it more creative and varied.\n- `CTX`: Enable context mode when set to \"true\" (Note: outputs will be sent to the API)\n- `OLLAMA_MODEL`: The Ollama model to use (default: \"phi3.5\")\n- `OLLAMA_API_BASE`: The Ollama endpoint to use (default: \"http://localhost:11434/v1/\")\n\n### Config File Example\n\n```json\n{\n \"OPENAI_API_KEY\": \"your_openai_api_key_here\",\n \"OPENAI_MODEL\": \"gpt-3.5-turbo\",\n \"SHAI_SUGGESTION_COUNT\": \"3\",\n \"CTX\": true\n}\n```\n\n### Config Example for OpenAI compatible\n\n```json\n{\n \"SHAI_API_PROVIDER\": \"openai\",\n \"OPENAI_API_KEY\": \"deepseek_api_key\",\n \"OPENAI_API_BASE\": \"https://api.deepseek.com\",\n \"OPENAI_MODEL\": \"deekseek-chat\",\n \"SHAI_SUGGESTION_COUNT\": \"3\",\n \"SHAI_SUGGESTION_COUNT\": \"3\",\n \"CTX\": true\n}\n```\n\n### Config example for MistralAI\n\n```json\n{\n \"SHAI_API_PROVIDER\": \"mistral\",\n \"MISTRAL_API_KEY\": \"mistral_api_key\",\n \"MISTRAL_API_BASE\": \"https://api.mistral.ai/v1\",\n \"MISTRAL_MODEL\": \"codestral-2508\",\n \"SHAI_SUGGESTION_COUNT\": \"3\",\n \"CTX\": true\n}\n\n```\n\n### Config Example for Ollama\n\n```json\n {\n \"OPENAI_API_KEY\":\"\",\n \"SHAI_SUGGESTION_COUNT\": \"3\",\n \"SHAI_API_PROVIDER\": \"ollama\",\n \"OLLAMA_MODEL\": \"phi3.5\",\n \"OLLAMA_API_BASE\": \"http://localhost:11434/v1/\",\n \"SHAI_TEMPERATURE\": \"0.05\"\n }\n```\n\nThe application will read from this file if it exists, overriding any existing environment variables.\n\nRun the application after setting these configurations.\n\n### Using with Groq\n\nTo use Shell AI with Groq:\n\n1. Get your API key from Groq\n2. Set the following environment variables:\n ```bash\n export SHAI_API_PROVIDER=groq\n export GROQ_API_KEY=your_api_key_here\n export GROQ_MODEL=llama-3.3-70b-versatile\n ```\n\n## Contributing\n\nThis implementation can be made much smarter! Contribute your ideas as Pull Requests and make AI Shell better for everyone.\n\nContributions are welcome! Please read the [CONTRIBUTING.md](CONTRIBUTING.md) for guidelines.\n\n## License\n\nShell-AI is licensed under the MIT License. See [LICENSE](LICENSE) for details.\n",
"bugtrack_url": null,
"license": null,
"summary": null,
"version": "0.4.4",
"project_urls": null,
"split_keywords": [],
"urls": [
{
"comment_text": null,
"digests": {
"blake2b_256": "6f4e011f0e858315a2df229ad6cb93bcfd0a57440fad3e23bce648d203a4886d",
"md5": "01adb2979c94c76a7f889ca4ecc1f53e",
"sha256": "cb7a89c1889c762ef072759b985c9d30a92c4b62e65fa19bcca6446ef7586193"
},
"downloads": -1,
"filename": "shell_ai-0.4.4-py3-none-any.whl",
"has_sig": false,
"md5_digest": "01adb2979c94c76a7f889ca4ecc1f53e",
"packagetype": "bdist_wheel",
"python_version": "py3",
"requires_python": null,
"size": 11403,
"upload_time": "2025-08-27T12:36:25",
"upload_time_iso_8601": "2025-08-27T12:36:25.338746Z",
"url": "https://files.pythonhosted.org/packages/6f/4e/011f0e858315a2df229ad6cb93bcfd0a57440fad3e23bce648d203a4886d/shell_ai-0.4.4-py3-none-any.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": null,
"digests": {
"blake2b_256": "5fa39e5a19d1c47a535723db2fd9ecbcb1532a257583ff28304ddc8fdc345b7e",
"md5": "926f4582ee2ffd9852ecbb5650ee6ef4",
"sha256": "5d27ee5e44c8fc8f8b258bc8908b2d3af4c62bf87bfa10cfadd058f3f4bc6a92"
},
"downloads": -1,
"filename": "shell_ai-0.4.4.tar.gz",
"has_sig": false,
"md5_digest": "926f4582ee2ffd9852ecbb5650ee6ef4",
"packagetype": "sdist",
"python_version": "source",
"requires_python": null,
"size": 11659,
"upload_time": "2025-08-27T12:36:26",
"upload_time_iso_8601": "2025-08-27T12:36:26.150022Z",
"url": "https://files.pythonhosted.org/packages/5f/a3/9e5a19d1c47a535723db2fd9ecbcb1532a257583ff28304ddc8fdc345b7e/shell_ai-0.4.4.tar.gz",
"yanked": false,
"yanked_reason": null
}
],
"upload_time": "2025-08-27 12:36:26",
"github": false,
"gitlab": false,
"bitbucket": false,
"codeberg": false,
"lcname": "shell-ai"
}