Name: ollama-chat
Version: 0.9.42
Home page: https://github.com/craigahobbs/ollama-chat
Summary: An Ollama chat web application
Upload time: 2024-12-31 13:37:19
Author: Craig A. Hobbs
License: MIT
Keywords: ollama, chatbot, conversational AI, artificial intelligence, AI
# ollama-chat

[![PyPI - Status](https://img.shields.io/pypi/status/ollama-chat)](https://pypi.org/project/ollama-chat/)
[![PyPI](https://img.shields.io/pypi/v/ollama-chat)](https://pypi.org/project/ollama-chat/)
[![GitHub](https://img.shields.io/github/license/craigahobbs/ollama-chat)](https://github.com/craigahobbs/ollama-chat/blob/main/LICENSE)
[![PyPI - Python Version](https://img.shields.io/pypi/pyversions/ollama-chat)](https://pypi.org/project/ollama-chat/)

**Ollama Chat** is a conversational AI chat client that uses [Ollama](https://ollama.com) to interact with local large
language models (LLMs).


## Features

- Platform independent - tested on macOS, Windows, and Linux
- Chat with any local Ollama model
- Prompt commands to include files, images, and URL content
- Regenerate the most recent conversation response
- Delete the most recent conversation exchange
- View responses as Markdown or text
- Save conversations as Markdown text
- Multiple concurrent chats
- Conversation templates for repeating prompts with variable substitutions
- Start a conversation or template from the command line
- Download and manage models


## Installation

To get up and running with Ollama Chat, follow these steps:

1. Install and start [Ollama](https://ollama.com)

2. Install Ollama Chat

   ~~~
   pip install ollama-chat
   ~~~


### Updating

To update Ollama Chat:

~~~
pip install -U ollama-chat
~~~


## Start Ollama Chat

To start Ollama Chat, open a terminal prompt and run the Ollama Chat application:

~~~
ollama-chat
~~~

A web browser is launched and opens the Ollama Chat web application.

By default, a configuration file, "ollama-chat.json", is created in the user's home directory.
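
As an illustration only (not Ollama Chat's actual code), locating and loading this configuration file can be sketched as follows; the file's fields are described by the Ollama Chat File Format documentation:

```python
# Sketch: find and load the Ollama Chat configuration file from a home
# directory. Returns None if the file does not exist.
import json
from pathlib import Path


def load_config(home=None):
    path = Path(home if home is not None else Path.home()) / "ollama-chat.json"
    return json.loads(path.read_text()) if path.exists() else None
```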

To start a conversation from the command line, use the `-m` argument:

~~~
ollama-chat -m "Why is the sky blue?"
~~~

To start a named template from the command line, use the `-t` and `-v` arguments:

~~~
ollama-chat -t AskAristotle -v Subject "Why is the sky blue?"
~~~


## Conversation Templates

Conversation Templates allow you to repeat the same prompts with different models. Templates can define variables for
use in the template title and prompt text (e.g., `{{var}}`).
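
As an illustration only (not Ollama Chat's implementation), `{{var}}` substitution can be thought of as simple text replacement of each placeholder with its value:

```python
# Illustrative sketch of {{var}} template substitution; the function name
# and approach are hypothetical, not Ollama Chat's actual code.
def substitute(text, variables):
    """Replace each {{name}} placeholder in text with its value."""
    for name, value in variables.items():
        text = text.replace("{{" + name + "}}", value)
    return text


# Example: a template prompt with a "Subject" variable
print(substitute("Answer as Aristotle would: {{Subject}}",
                 {"Subject": "Why is the sky blue?"}))
# Answer as Aristotle would: Why is the sky blue?
```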

There are two ways to create a template: click "Add Template" on the index page to create a new template and open it in
the template editor, or click "Template" in a conversation view's menu.


## Prompt Commands

Ollama Chat supports special **prompt commands** that allow you to include files, images, and URL content in
your prompt, among other things. The following prompt commands are available:

- `/file` - include a file

  ```
  /file README.md

  Please summarize the README file.
  ```

- `/image` - include an image

  ```
  /image image.jpeg

  Please summarize the image.
  ```

- `/dir` - include files from a directory (optionally filtered by file extension)

  ```
  /dir src/ollama_chat py

  Please provide a summary for each Ollama Chat source file.
  ```

- `/url` - include a URL resource

  ```
  /url https://craigahobbs.github.io/ollama-chat/README.md

  Please summarize the README file.
  ```

- `/do` - execute a conversation template by name

  ```
  /do city-report -v CityState "Seattle, WA"
  ```

To get help for a prompt command, use the `-h` option:

```
/file -h
```
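
Conceptually, a command like `/file` inlines the referenced content into the prompt before it is sent to the model. A rough, hypothetical sketch of the idea (not the actual implementation):

```python
# Hypothetical sketch of what a /file prompt command amounts to:
# the file's contents are inlined ahead of the user's question.
from pathlib import Path


def expand_file_command(path, question):
    return Path(path).read_text() + "\n\n" + question
```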


## File Format and API Documentation

[Ollama Chat File Format](https://craigahobbs.github.io/ollama-chat/api.html#var.vName='OllamaChatConfig')

[Ollama Chat API](https://craigahobbs.github.io/ollama-chat/api.html)


## Development

This package is developed using [python-build](https://github.com/craigahobbs/python-build#readme).
It was started using [python-template](https://github.com/craigahobbs/python-template#readme) as follows:

~~~
template-specialize python-template/template/ ollama-chat/ -k package ollama-chat -k name 'Craig A. Hobbs' -k email 'craigahobbs@gmail.com' -k github 'craigahobbs' -k noapi 1
~~~

            
