# ollama-chat
[![PyPI - Status](https://img.shields.io/pypi/status/ollama-chat)](https://pypi.org/project/ollama-chat/)
[![PyPI](https://img.shields.io/pypi/v/ollama-chat)](https://pypi.org/project/ollama-chat/)
[![GitHub](https://img.shields.io/github/license/craigahobbs/ollama-chat)](https://github.com/craigahobbs/ollama-chat/blob/main/LICENSE)
[![PyPI - Python Version](https://img.shields.io/pypi/pyversions/ollama-chat)](https://pypi.org/project/ollama-chat/)
**Ollama Chat** is a conversational AI chat client that uses [Ollama](https://ollama.com) to interact with local large
language models (LLMs).
## Features
- Platform independent - tested on macOS, Windows, and Linux
- Chat with any local Ollama model
- Save conversations for later viewing and interaction
- Regenerate the most recent conversation response
- Delete the most recent conversation exchange
- View responses as Markdown or text
- Save conversations as Markdown text
- Multiple concurrent chats
- Prompt commands for including file and URL content
- Conversation templates for repeating prompts with variable substitutions
- Start a conversation or template from the command line
- Download and manage models
## Installation
To get up and running with Ollama Chat, follow these steps:
1. Install and start [Ollama](https://ollama.com)
2. Install Ollama Chat
~~~
pip install ollama-chat
~~~
### Updating
To update Ollama Chat:
~~~
pip install -U ollama-chat
~~~
## Start Ollama Chat
To start Ollama Chat, open a terminal prompt and run the Ollama Chat application:
~~~
ollama-chat
~~~
A web browser is launched and opens the Ollama Chat web application.
By default, a configuration file, "ollama-chat.json", is created in the user's home directory.
To start a conversation from the command line, use the `-m` argument:
~~~
ollama-chat -m "Why is the sky blue?"
~~~
To start a named template from the command line, use the `-t` and `-v` arguments:
~~~
ollama-chat -t AskAristotle -v Subject "Why is the sky blue?"
~~~
## Conversation Templates
Conversation Templates allow you to repeat the same prompts with different models. Templates can define variables for
use in the template title and prompt text (e.g., `{{var}}`).
There are two ways to create a template: click "Add Template" on the index page to create a new template and open it
in the template editor, or click "Template" in a conversation view's menu to create a template from that conversation.
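To illustrate the idea of variable substitution, here is a minimal sketch of how `{{var}}` placeholders might be expanded in a template's title or prompt text. This is only an illustration of the concept, not Ollama Chat's actual implementation:

```python
import re

def render_template(text, variables):
    # Replace each {{name}} placeholder with its value from the variables dict
    return re.sub(r"\{\{(\w+)\}\}", lambda m: variables[m.group(1)], text)

print(render_template("Ask Aristotle: {{Subject}}", {"Subject": "Why is the sky blue?"}))
# → Ask Aristotle: Why is the sky blue?
```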
## Prompt Commands
Ollama Chat supports special **prompt commands** that allow you to include file and URL content in
your prompt, among other things. The following prompt commands are available:
- `/file` - include a file
```
/file README.md
Please summarize the README file.
```
- `/dir` - include files from a directory
```
/dir src/ollama_chat py
Please provide a summary for each Ollama Chat source file.
```
- `/url` - include a URL resource
```
/url https://craigahobbs.github.io/ollama-chat/README.md
Please summarize the README file.
```
- `/do` - execute a conversation template by name or title
```
/do city-report -v CityState "Seattle, WA"
```
To get help for a prompt command, use the `-h` option:
```
/file -h
```
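Conceptually, a prompt command like `/file` replaces its own line with the referenced content before the prompt is sent to the model. The sketch below illustrates that idea in the simplest possible form, assuming the command merely inlines the file's contents verbatim; the real implementation likely handles options, errors, and formatting differently:

```python
from pathlib import Path

def expand_file_commands(prompt):
    # Replace each "/file <path>" line with that file's contents;
    # pass all other lines through unchanged.
    out = []
    for line in prompt.splitlines():
        if line.startswith("/file "):
            out.append(Path(line[len("/file "):]).read_text())
        else:
            out.append(line)
    return "\n".join(out)
```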
## File Format and API Documentation
[Ollama Chat File Format](https://craigahobbs.github.io/ollama-chat/api.html#var.vName='OllamaChatConfig')
[Ollama Chat API](https://craigahobbs.github.io/ollama-chat/api.html)
## Development
This package is developed using [python-build](https://github.com/craigahobbs/python-build#readme).
It was started using [python-template](https://github.com/craigahobbs/python-template#readme) as follows:
~~~
template-specialize python-template/template/ ollama-chat/ -k package ollama-chat -k name 'Craig A. Hobbs' -k email 'craigahobbs@gmail.com' -k github 'craigahobbs' -k noapi 1
~~~