# ollama-chat
[![PyPI - Status](https://img.shields.io/pypi/status/ollama-chat)](https://pypi.org/project/ollama-chat/)
[![PyPI](https://img.shields.io/pypi/v/ollama-chat)](https://pypi.org/project/ollama-chat/)
[![GitHub](https://img.shields.io/github/license/craigahobbs/ollama-chat)](https://github.com/craigahobbs/ollama-chat/blob/main/LICENSE)
[![PyPI - Python Version](https://img.shields.io/pypi/pyversions/ollama-chat)](https://pypi.org/project/ollama-chat/)
**Ollama Chat** is a web chat client for
[Ollama](https://ollama.com)
that allows you to chat locally (and privately) with
[Large Language Models (LLMs)](https://ollama.com/library).
## Features
- Select local model to chat with
- Save conversations for later viewing and interaction
- Enter single or multiline prompts
- Regenerate the most recent conversation response
- Delete the most recent conversation exchange
- View responses as Markdown text
- Save conversations as Markdown text
- Multiple concurrent chat responses (with proper Ollama configuration)
## Installation
To get up and running with Ollama Chat, follow these steps:
1. Install and start [Ollama](https://ollama.com)
2. Install Ollama Chat
~~~
pip install ollama-chat
~~~
### Updating
To update Ollama Chat:
~~~
pip install -U ollama-chat
~~~
## Start Ollama Chat
To start Ollama Chat, open a terminal prompt and run the Ollama Chat application:
~~~
ollama-chat
~~~
A web browser launches and opens the Ollama Chat web application.
By default, a configuration file, "ollama-chat.json", is created in the user's home directory.
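Since the configuration lives in a plain JSON file in the home directory, it can be inspected from a script. A minimal sketch (assuming only the default location stated above; the file's schema is defined by the Ollama Chat File Format documentation):

```python
import json
from pathlib import Path

# By default, Ollama Chat writes its configuration to "ollama-chat.json"
# in the user's home directory
config_path = Path.home() / "ollama-chat.json"

if config_path.exists():
    # Pretty-print the current configuration
    config = json.loads(config_path.read_text())
    print(json.dumps(config, indent=4))
else:
    print(f"No configuration yet - run ollama-chat once to create {config_path}")
```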
## Start Conversation from CLI
To start a conversation from the command line, use the `-m` argument:
~~~
ollama-chat -m "Why is the sky blue?"
~~~
## File Format and API Documentation
[Ollama Chat File Format](https://craigahobbs.github.io/ollama-chat/api.html#var.vName='OllamaChatConfig')
[Ollama Chat API](https://craigahobbs.github.io/ollama-chat/api.html)
## Development
This package is developed using [python-build](https://github.com/craigahobbs/python-build#readme).
It was started using [python-template](https://github.com/craigahobbs/python-template#readme) as follows:
~~~
template-specialize python-template/template/ ollama-chat/ -k package ollama-chat -k name 'Craig A. Hobbs' -k email 'craigahobbs@gmail.com' -k github 'craigahobbs' -k noapi 1
~~~