# 🦙 Ollama Manager
![Python Version](https://img.shields.io/badge/Python-3.11-brightgreen?style=flat-square)
![Downloads](https://static.pepy.tech/badge/ollama-manager)
CLI app to manage Ollama models.
<a href="https://youtu.be/1y2TohQdNbo">
<img src="https://i.imgur.com/iA0LB0e.gif" width="800">
</a>
## ⚡️ Features
- List and download remote models from the [🦙 Ollama library](https://ollama.dev/models) or [🤗 Hugging Face](https://huggingface.co/models?sort=trending&search=gguf)
- Delete existing Ollama models
- Launch models in a Streamlit UI
- Fuzzy search
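Fuzzy search lets you narrow a long model list with a partial, approximate query. The project's own matcher isn't shown here, but the idea can be sketched with Python's stdlib `difflib` (the function name `fuzzy_filter` and the sample model names are illustrative, not part of ollama-manager):

```python
from difflib import SequenceMatcher

def fuzzy_filter(query: str, names: list[str], cutoff: float = 0.4) -> list[str]:
    """Rank candidate names by similarity to the query; drop weak matches."""
    scored = [
        (SequenceMatcher(None, query.lower(), name.lower()).ratio(), name)
        for name in names
    ]
    return [name for score, name in sorted(scored, reverse=True) if score >= cutoff]

models = ["llama3.2:3b", "llama3.1:8b", "mistral:7b", "qwen2.5:7b"]
print(fuzzy_filter("llama3", models))
```

Typing `llama3` surfaces both `llama3.x` tags first, even though neither is an exact match.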
## 🚀 Installation
```sh
pip install ollama-manager
# OR
pipx install ollama-manager
```
For development (installs the app in editable mode):
```sh
make setup
```
## ✨ Usage
### Pull Remote Model
```sh
olm pull
```
Pull Hugging Face models:
```sh
olm pull -hf
# With query:
olm pull -hf -q llama3.2
# With limit:
olm pull -hf -q llama3.2 -l 10
```
### Delete Local Models
Delete a single model:
```sh
olm rm
```
Delete multiple models:
```sh
olm rm -m
```
Delete a model without the confirmation prompt:
```sh
olm rm -y
```
### Run Selected Model
Run the selected model in the Ollama terminal UI:
```sh
olm run
```
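Commands like `olm run` and `olm rm` present your locally installed models for selection. Ollama exposes that list through its local REST API (`GET /api/tags` on the default `localhost:11434`); a minimal sketch of fetching and parsing it (an illustration of the Ollama API, not ollama-manager's internals):

```python
import json
import urllib.request

def model_names(payload: dict) -> list[str]:
    """Extract model names from an Ollama /api/tags response."""
    return [m["name"] for m in payload.get("models", [])]

def list_local_models(host: str = "http://localhost:11434") -> list[str]:
    # Requires a running Ollama server on the given host.
    with urllib.request.urlopen(f"{host}/api/tags") as resp:
        return model_names(json.load(resp))
```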
---
**Run models in a Streamlit UI:**
<details>
<summary>Ollama Manager UI</summary>
<img src="https://i.imgur.com/UqQLjXx.gif" width="800" />
</details>
This requires the optional `ui` dependencies (quoted so the brackets survive shells like zsh):
```sh
pip install 'ollama-manager[ui]'
```
Then use the following command to select a model:
```sh
# For Text Models
olm run -ui text
# For Vision Models
olm run -ui vision
```
## Getting Help
```sh
olm --help
olm <sub-command> --help
```