ollama-chat

Name: ollama-chat
Version: 0.9.52
Home page: https://github.com/craigahobbs/ollama-chat
Summary: An Ollama chat web application
Upload time: 2025-02-26 23:45:41
Author: Craig A. Hobbs
License: MIT
Keywords: ollama, chatbot, conversational AI, artificial intelligence, AI
# ollama-chat

[![PyPI - Status](https://img.shields.io/pypi/status/ollama-chat)](https://pypi.org/project/ollama-chat/)
[![PyPI](https://img.shields.io/pypi/v/ollama-chat)](https://pypi.org/project/ollama-chat/)
[![GitHub](https://img.shields.io/github/license/craigahobbs/ollama-chat)](https://github.com/craigahobbs/ollama-chat/blob/main/LICENSE)
[![PyPI - Python Version](https://img.shields.io/pypi/pyversions/ollama-chat)](https://pypi.org/project/ollama-chat/)

**Ollama Chat** is a conversational AI chat client that uses [Ollama](https://ollama.com) to interact with local large
language models (LLMs).


## Features

- Platform independent - tested on macOS, Windows, and Linux
- Chat with local LLMs (using Ollama) entirely offline
- Prompt commands to include files, images, and URL content
- Regenerate the most recent conversation response
- Delete the most recent conversation exchange
- View responses as Markdown or text
- Save conversations as Markdown text
- Multiple concurrent chats
- Conversation templates for repeating prompts with variable substitutions
- Start a conversation or template from the command line
- Download and manage models


## Installation

To get up and running with Ollama Chat, follow these steps:

1. Install and start [Ollama](https://ollama.com)

2. Install Ollama Chat

   ~~~
   pip install ollama-chat
   ~~~


## Start Ollama Chat

To start Ollama Chat, open a terminal prompt and run the Ollama Chat application:

~~~
ollama-chat
~~~

A web browser is launched and opens the Ollama Chat application.

By default, a configuration file, "ollama-chat.json", is created in the user's home directory.


### Start a Conversation from the Command Line

To start a conversation from the command line, use the `-m` argument:

~~~
ollama-chat -m "Why is the sky blue?"
~~~


### Start a Template from the Command Line

To start a named template from the command line, use the `-t` and `-v` arguments:

~~~
ollama-chat -t askAristotle -v question "Why is the sky blue?"
~~~


## Conversation Templates

Conversation templates let you repeat the same prompts with different models. Templates can define variables for
use in the template title and prompt text (e.g., `{{var}}`).

There are two ways to create a template: click "Add Template" on the index page to create a new template and open it
in the template editor, or click "Template" in a conversation view's menu to create a template from that conversation.

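For example, the `askAristotle` template used in the command-line example above might define a `question` variable
in its prompt text. The following prompt text is purely illustrative (it is not the actual template):

```
You are Aristotle, the ancient Greek philosopher.
Answer the following question in your own voice:

{{question}}
```

Running the template with `-v question "Why is the sky blue?"` substitutes the value wherever `{{question}}` appears.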

## Prompt Commands

Ollama Chat supports special **prompt commands** that allow you to include files, images, and URL content in
your prompt, among other things. The following prompt commands are available:

- `/file` - include a file

  ```
  /file README.md

  Please summarize the README file.
  ```

- `/image` - include an image

  ```
  /image image.jpeg

  Please summarize the image.
  ```

- `/dir` - include files from a directory

  ```
  /dir src/ollama_chat py

  Please provide a summary for each Ollama Chat source file.
  ```

- `/url` - include a URL resource

  ```
  /url https://craigahobbs.github.io/ollama-chat/README.md

  Please summarize the README file.
  ```

- `/do` - execute a conversation template by name

  ```
  /do city-report -v CityState "Seattle, WA"
  ```

- `/?` - list available prompt commands

  ```
  /?
  ```

To get help for a prompt command, use the `-h` option:

```
/file -h
```


## File Format and API Documentation

[Ollama Chat File Format](https://craigahobbs.github.io/ollama-chat/api.html#var.vName='OllamaChatConfig')

[Ollama Chat API](https://craigahobbs.github.io/ollama-chat/api.html)


## Development

This package is developed using [python-build](https://github.com/craigahobbs/python-build#readme).
It was started using [python-template](https://github.com/craigahobbs/python-template#readme) as follows:

~~~
template-specialize python-template/template/ ollama-chat/ -k package ollama-chat -k name 'Craig A. Hobbs' -k email 'craigahobbs@gmail.com' -k github 'craigahobbs' -k noapi 1
~~~

            
