# ollama-chat
**Ollama Chat** is a conversational AI chat client that uses [Ollama](https://ollama.com) to
interact with local large language models (LLMs) entirely offline. Ideal for AI enthusiasts,
developers, or anyone wanting private, offline LLM chats.
## Features
- Chat with local large language models (LLMs) entirely offline
- ***Prompt Commands*** to include files, images, and URL content
- ***Conversation Templates*** for repeating prompts with variable substitutions
- Browse, download, monitor, and select local models directly in the app
- Multiple concurrent chats
- Show/hide thinking of reasoning models
- Regenerate the most recent conversation response
- Delete the most recent conversation exchange
- View responses as Markdown or text
- Save conversations as Markdown text
- Start a conversation or template from the command line
- Platform independent - tested on macOS, Windows, and Linux
## Installation
To get up and running with Ollama Chat, follow these steps:
1. Install [Ollama](https://ollama.com/download)
2. Install Ollama Chat
**macOS and Linux**
~~~
python3 -m venv $HOME/venv --upgrade-deps
. $HOME/venv/bin/activate
pip install ollama-chat
~~~
**Windows**
~~~
python -m venv %USERPROFILE%\venv --upgrade-deps
%USERPROFILE%\venv\Scripts\activate
pip install ollama-chat
~~~
## Start Ollama Chat
To start Ollama Chat, open a terminal and follow the steps for your OS. When Ollama Chat starts,
it launches your web browser and opens the Ollama Chat application.
By default, a configuration file, "ollama-chat.json", is created in the user's home directory.
### macOS and Linux
~~~
. $HOME/venv/bin/activate
ollama-chat
~~~
### Windows
~~~
%USERPROFILE%\venv\Scripts\activate
ollama-chat
~~~
### Start a Conversation from the Command Line
To start a conversation from the command line, use the `-m` argument:
~~~
ollama-chat -m "Why is the sky blue?"
~~~
### Start a Template from the Command Line
To start a named template from the command line, use the `-t` and `-v` arguments:
~~~
ollama-chat -t askAristotle -v question "Why is the sky blue?"
~~~
## Add a Desktop Launcher
To add a desktop launcher, follow the steps for your OS.
### macOS
In Finder, locate the "ollama-chat" executable and drag-and-drop it into the lower portion of the
Dock.
### Windows
In File Explorer, locate the "ollama-chat" executable, right-click it, and select "Pin to Start".
### GNOME (Linux)
1. Copy the following Ollama Chat desktop file contents:
~~~
[Desktop Entry]
Name=Ollama Chat
Exec=sh -c "$HOME/venv/bin/ollama-chat"
Type=Application
Icon=dialog-information
Terminal=true
Categories=Utility;
~~~
2. Create the Ollama Chat desktop file and paste the contents:
~~~
nano $HOME/.local/share/applications/ollama-chat.desktop
~~~
3. Update the "Exec" path, if necessary, and save
## Conversation Templates
Conversation Templates allow you to repeat a sequence of prompts. Templates can include variable
substitutions in the title text and the prompt text (e.g., `{{var}}`).
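As an illustration of how `{{var}}` placeholders work, the substitution can be sketched in Python. The `substitute` helper and the template dict below are hypothetical examples, not Ollama Chat's actual implementation:

```python
import re

def substitute(text, variables):
    # Replace each {{name}} placeholder with its value from the variables dict
    return re.sub(r"\{\{(\w+)\}\}", lambda m: variables[m.group(1)], text)

# A hypothetical template with a "question" variable in its title and prompt
template = {
    "title": "Ask Aristotle: {{question}}",
    "prompts": ["As Aristotle, answer: {{question}}"],
}
variables = {"question": "Why is the sky blue?"}

title = substitute(template["title"], variables)
prompts = [substitute(p, variables) for p in template["prompts"]]
```

See the [Ollama Chat File Format](https://craigahobbs.github.io/ollama-chat/api.html#var.vName='OllamaChatConfig') documentation for the actual template schema.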
### Create a Template
There are two ways to create a template:
- Click "Add Template" from the home page
- Click "Template" on a conversation page
### Run a Template
To run a template, click its title on the home page. If the template has any variables, you are
prompted for their values before the template runs. When a template runs, a new conversation
is created and each prompt is entered in sequence.
### Edit a Template
To edit a template, from the home page, click "Select" on the template you want to edit, and then
click "Edit". On the template editor page you can update the template's title, set its name, add or
remove variables, and add or remove prompts.
## Prompt Commands
Ollama Chat supports special **prompt commands** that allow you to include files, images, and URL
content in your prompt, among other things. The following prompt commands are available:
- `/file` - include a file
```
/file README.md
```
- `/image` - include an image
```
/image image.jpeg
```
- `/dir` - include files from a directory
```
/dir src/ollama_chat py
```
- `/url` - include a URL resource
```
/url https://craigahobbs.github.io/ollama-chat/README.md
```
- `/do` - execute a conversation template by name
```
/do city-report -v CityState "Seattle, WA"
```
- `/?` - list available prompt commands
```
/?
```
To get help for a prompt command, use the `-h` option:
```
/file -h
```
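Prompt commands follow shell-style quoting, so multi-word arguments like `"Seattle, WA"` stay together. As a rough sketch (using Python's `shlex`, not Ollama Chat's actual parser), a command line can be tokenized like this:

```python
import shlex

def parse_prompt_command(line):
    # Split a prompt command such as '/do city-report -v CityState "Seattle, WA"'
    # into the command name and its arguments, honoring quoted strings
    parts = shlex.split(line)
    return parts[0], parts[1:]

command, args = parse_prompt_command('/do city-report -v CityState "Seattle, WA"')
# command == "/do"; args == ["city-report", "-v", "CityState", "Seattle, WA"]
```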
## File Format and API Documentation
[Ollama Chat File Format](https://craigahobbs.github.io/ollama-chat/api.html#var.vName='OllamaChatConfig')
[Ollama Chat API](https://craigahobbs.github.io/ollama-chat/api.html)
## Development
This package is developed using [python-build](https://github.com/craigahobbs/python-build#readme).
It was started using [python-template](https://github.com/craigahobbs/python-template#readme) as follows:
~~~
template-specialize python-template/template/ ollama-chat/ -k package ollama-chat -k name 'Craig A. Hobbs' -k email 'craigahobbs@gmail.com' -k github 'craigahobbs' -k noapi 1
~~~