ai-terminal

- Name: ai-terminal
- Version: 0.1.0rc1
- Summary: Interface to interact with a chatbot in Linux terminal
- Upload time: 2024-02-17 20:48:29
- License: BSD-3-Clause
# MistralTerminal - chatbot in your Linux terminal

## Overview

MistralTerminal is a command-line interface for interacting with MistralAI. It allows users to send questions directly from the terminal and receive concise answers. The script is designed for simplicity and ease of use. It keeps the conversation history, so you can continue interacting with it without giving the context every time. Input can span multiple lines. If the answer contains code blocks, you can easily copy them to the clipboard. Colored output is supported for enhanced readability.

## License

License: BSD 3-Clause

## Author

V.A. Yastrebov, CNRS, MINES Paris - PSL, France, Jan 2024.

## External Contributors

- [Basile Marchand](https://github.com/basileMarchand)

## Requirements

- An API key for MistralAI; see [mistral.ai](https://mistral.ai)

## Installation

Set your MistralAI API key in your environment:

```bash
export MISTRAL_API_KEY='your_api_key_here'
```
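
To make the key available in every new terminal session, you can append the same export line to your shell startup file (a minimal sketch, assuming a bash-like shell that reads `~/.bashrc`):

```bash
# Persist the MistralAI API key for future shells
echo "export MISTRAL_API_KEY='your_api_key_here'" >> ~/.bashrc
source ~/.bashrc
```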

### From GitHub sources

```bash
git clone https://github.com/vyastreb/ai-terminal.git
cd ai-terminal
```

Install the required dependencies and the `ai` command with:

```bash
pip install .
```
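
Once installed, you can check that the `ai` command is on your PATH and see the available options:

```bash
ai --help
```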

### From PyPI

**Coming soon**

## Options

- `--model/-m`: Sets the MistralAI model to be used (mistral-tiny, mistral-small or mistral-medium). Default is 'mistral-tiny'.
- `--temp/-T`: Sets the temperature for the AI's responses. Default is 0.2.
- `--tokens/-t`: Sets the maximum number of tokens in the response. Default is 2.
- `--verbose/-v`: If set, prints the question or the whole conversation history.
- `--no-chat/-n`: If set, does not keep the discussion in memory.
- `--unit-test/-u`: Runs the unit tests.
- `--help/-h`: Displays the help message and usage instructions.

The default model, 'mistral-tiny', is a smaller model that is faster to load and run. The temperature controls how creative the answers are, and the token limit caps their length; both can be tuned per question on the command line or globally in the config file. The verbose option is useful for checking whether your questions and history were correctly parsed by the script.
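
For example, a one-off question that should not be stored in the conversation memory can combine the flags above:

```bash
# One-off question: keep nothing in memory, allow a longer answer, show the parsed question
ai --no-chat -t 500 -v
> How do I check disk usage per directory in Linux?
```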

### Configuration file

All parameters can be defined in the script or in the config file `~/.mistralai/config.json`.
If the config file is not found, the script uses its default parameters.
If the config file exists, its parameters are used and those defined in the script are ignored.
Options given on the command line take precedence over the config file.

_Example of config file:_

```json
{
 "model": "mistral-tiny",
 "max_memory": 31,
 "max_tokens": 1000,
 "waitingTime": 180,
 "max_line_length": 80,
 "temperature": 0.5
}
```
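
To create the file from the shell, you can write the example above to the expected location (a sketch; the values are just the example defaults shown above):

```bash
# Create the configuration directory and file used by the script
mkdir -p ~/.mistralai
cat > ~/.mistralai/config.json <<'EOF'
{
 "model": "mistral-tiny",
 "max_memory": 31,
 "max_tokens": 1000,
 "waitingTime": 180,
 "max_line_length": 80,
 "temperature": 0.5
}
EOF
```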

## Usage

```bash
ai [options]
```

Running the command starts a prompt line `> ` where your question can be typed.

**Examples:**

```bash
ai
> How to convert jpg to png in linux?
```

```bash
ai --temp 0.0
> What is the meaning of life?
```

```bash
ai --model mistral-tiny --temp 0.8 --tokens 5000
> What is the best (according to parisian) cheese in France?
```

```bash
ai -m mistral-small -T 0.8 -t 500 -v
> What is the visible EM spectrum?
```

### Features

- Can be run from anywhere in the terminal
- Supports multi-line input
- Remembers past questions
- If the answer contains a single code block, it is automatically copied to the clipboard
- If the answer contains several code blocks, it asks which one you want to copy to the clipboard
- Keeps the conversation history in a local file for a user-defined time
- Colored output for enhanced readability
- Adjustable parameters for model, temperature, and token count
- Supports multi-line responses with automatic line wrapping

### Notes

- Ensure your terminal supports ANSI color codes for the best experience.
- The history of the last messages (31 by default) is stored in `~/.mistralai/history.txt` for 3 minutes by default; you can inspect or clear this file directly, as shown below. The number of stored messages and the retention time can be changed in the script or in the config file via `max_memory` and `waitingTime`.
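
A minimal sketch for working with that file (assuming the script simply recreates `~/.mistralai/history.txt` when it is missing):

```bash
# Inspect the stored conversation history
cat ~/.mistralai/history.txt

# Forget the previous conversation by removing the file
rm ~/.mistralai/history.txt
```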


            
