elia-chat


- Name: elia-chat
- Version: 1.5.0
- Summary: A powerful terminal user interface for interacting with large language models.
- Requires Python: >=3.11
- Uploaded: 2024-05-14 20:52:43
- Requirements: none recorded
            <h1 align="center">
    <img src="https://github.com/darrenburns/elia/assets/5740731/4037b91a-1ad8-4d5b-884d-b3f1b495acf4" width="126px">
</h1>
<p align="center">
  <i align="center">A snappy, keyboard-centric terminal user interface for interacting with large language models.</i><br>
  <i align="center">Chat with Claude 3, ChatGPT, and local models like Llama 3, Phi 3, Mistral and Gemma.</i>
</p>

![elia-screenshot-collage](https://github.com/darrenburns/elia/assets/5740731/75f8563f-ce1a-4c9c-98c0-1bd1f7010814)

## Introduction

`elia` is an application for interacting with LLMs that runs entirely in your terminal, and is designed to be keyboard-focused, efficient, and fun to use!
It stores your conversations in a local SQLite database, and allows you to interact with a variety of models.
Speak with proprietary models such as ChatGPT and Claude, or with local models running through `ollama` or LocalAI.

## Installation

Install Elia with [pipx](https://github.com/pypa/pipx):

```bash
pipx install elia-chat
```

Depending on the model you wish to use, you may need to set one or more environment variables (e.g. `OPENAI_API_KEY`, `ANTHROPIC_API_KEY`, etc).
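For example, in a POSIX shell you might export the relevant keys before launching (the values below are placeholders):

```bash
# Placeholder values; substitute your real keys.
export OPENAI_API_KEY="sk-your-openai-key"
export ANTHROPIC_API_KEY="sk-ant-your-anthropic-key"
```

Add these to your shell profile (e.g. `~/.bashrc`) if you want them set in every session.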

## Quickstart

Launch Elia from the command line:

```bash
elia
```

Launch a new chat directly from the command line:

```bash
elia "What is the Zen of Python?"
```

## Running local models

1. Install [`ollama`](https://github.com/ollama/ollama).
2. Pull the model you require, e.g. `ollama pull llama3`.
3. Run the local ollama server: `ollama serve`.
4. Add the model to the config file (see below).

## Configuration

The location of the configuration file is noted at the bottom of
the options window (`ctrl+o`).

The example file below shows the available options, as well as examples of how to add new models.

```toml
# the *ID* for the model that is selected by default on launch
# to use one of the default built-in OpenAI/Anthropic models, prefix
# the model name with `elia-`.
default_model = "elia-gpt-3.5-turbo"
# the system prompt on launch
system_prompt = "You are a helpful assistant who talks like a pirate."

# example of adding local llama3 support
# only the `name` field is required here.
[[models]]
name = "ollama/llama3"

# example of a model running on a local server, e.g. LocalAI
[[models]]
name = "openai/some-model"
api_base = "http://localhost:8080/v1"
api_key = "api-key-if-required"

# example of adding a Groq model, showing some other fields
[[models]]
name = "groq/llama2-70b-4096"
display_name = "Llama 2 70B"  # appears in UI
provider = "Groq"  # appears in UI
temperature = 1.0  # high temp = high variation in output
max_retries = 0  # number of retries on failed request

# example of multiple instances of one model, e.g. you might
# have a 'work' OpenAI org and a 'personal' org.
[[models]]
id = "work-gpt-3.5-turbo"
name = "gpt-3.5-turbo"
display_name = "GPT 3.5 Turbo (Work)"

[[models]]
id = "personal-gpt-3.5-turbo"
name = "gpt-3.5-turbo"
display_name = "GPT 3.5 Turbo (Personal)"
```

## Import from ChatGPT

Export your conversations to a JSON file using the ChatGPT UI, then import them using the `import` command.

```bash
elia import 'path/to/conversations.json'
```

## Wiping the database

```bash
elia reset
```

## Uninstalling

```bash
pipx uninstall elia-chat
```

            
