| Name | ollama-code-cli |
| Version | 1.0.1 |
| home_page | None |
| Summary | A CLI tool for coding tasks using local LLMs with tool calling |
| upload_time | 2025-08-31 06:51:36 |
| maintainer | None |
| docs_url | None |
| author | None |
| requires_python | >=3.10 |
| license | None |
| keywords | cli, ollama, llm, ai, coding |
| VCS | |
| bugtrack_url | |
| requirements | No requirements were recorded. |
| Travis-CI | No Travis. |
| coveralls test coverage | No coveralls. |
# Ollama Code CLI
A beautiful, interactive command-line tool for coding tasks, powered by local LLMs via Ollama with tool-calling support.
## Features
- 🎨 **Beautiful CLI Interface** - Rich colors and structured output
- 🤖 **Local AI Power** - Interact with local LLMs through Ollama
- 🛠️ **Tool Calling** - Execute coding-related tools (file operations, code execution, etc.)
- 💬 **Interactive Mode** - Maintain conversation context for multi-turn interactions
- 📝 **Markdown Support** - Beautifully formatted responses with syntax highlighting
- 📋 **Structured Output** - Clear panels and tables for tool calls and results
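For a concrete feel of the formatting features above, here is a minimal Rich sketch (illustrative only — the response text, panel title, and styling are invented for this example, not the CLI's actual code):

```python
from rich.console import Console
from rich.markdown import Markdown
from rich.panel import Panel
from rich.syntax import Syntax

console = Console()

# Render a (made-up) model response as formatted Markdown.
console.print(Markdown("**Done!** I saved the function to `factorial.py`."))

# Frame a syntax-highlighted snippet in a panel, similar to how
# tool calls and their results are presented.
code = "def factorial(n):\n    return 1 if n <= 1 else n * factorial(n - 1)"
console.print(Panel(Syntax(code, "python"), title="write_file: factorial.py"))
```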
## Installation
```bash
pip install ollama-code-cli
```
## Usage
```bash
# Start an interactive session
ollama-code-cli

# Run a single prompt
ollama-code-cli "Create a Python function to calculate factorial"

# Use a specific model
ollama-code-cli --model llama3.1 "Explain how async/await works in Python"
```
## Available Tools
- `read_file`: Read the contents of a file
- `write_file`: Write content to a file
- `execute_code`: Execute code in a subprocess
- `list_files`: List files in a directory
- `run_command`: Run a shell command
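Tools like the ones listed above are exposed to the model through Ollama's tool-calling API. The sketch below shows the general pattern using the Ollama Python client; the schema and dispatch code are illustrative assumptions, not the package's actual `tool_manager.py`:

```python
import ollama

# Hypothetical JSON-schema declaration for the read_file tool.
read_file_tool = {
    "type": "function",
    "function": {
        "name": "read_file",
        "description": "Read the contents of a file",
        "parameters": {
            "type": "object",
            "properties": {
                "path": {"type": "string", "description": "Path of the file to read"},
            },
            "required": ["path"],
        },
    },
}

def read_file(path: str) -> str:
    """Local implementation the model's tool call is dispatched to."""
    with open(path, encoding="utf-8") as f:
        return f.read()

response = ollama.chat(
    model="llama3.1",
    messages=[{"role": "user", "content": "Read the contents of main.py"}],
    tools=[read_file_tool],
)

# If the model chose to call a tool, run the matching local function.
for call in response.message.tool_calls or []:
    if call.function.name == "read_file":
        print(read_file(**call.function.arguments))
```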
## Examples
1. Create a Python script and save it to a file:
```bash
ollama-code-cli "Create a Python script that calculates factorial and save it to a file named factorial.py"
```
2. Read a file and explain its contents:
```bash
ollama-code-cli "Read the contents of main.py and explain what it does"
```
3. Execute a shell command:
```bash
ollama-code-cli "List all files in the current directory"
```
## Interactive Mode
Launch the interactive mode for a conversational experience:
```bash
ollama-code-cli
```
In interactive mode, you can:
- Have multi-turn conversations with the AI
- See beautiful formatted responses with Markdown support
- Watch tool calls and results in real-time with visual panels
- Clear conversation history with the `clear` command
- Exit gracefully with the `exit` command
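Conceptually, the interactive loop appends each turn to a shared message history, roughly like this simplified sketch (without the Rich formatting or tool dispatch the real CLI adds):

```python
import ollama

def interactive_session(model: str = "qwen3") -> None:
    history = []  # conversation context carried across turns
    while True:
        user_input = input("> ").strip()
        if user_input == "exit":
            break              # exit gracefully
        if user_input == "clear":
            history.clear()    # wipe the conversation history
            continue
        history.append({"role": "user", "content": user_input})
        reply = ollama.chat(model=model, messages=history)
        history.append({"role": "assistant", "content": reply.message.content})
        print(reply.message.content)
```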
## Project Structure
```
ollamacode/
├── ollamacode/
│   ├── __init__.py
│   ├── cli/
│   │   ├── __init__.py
│   │   └── cli.py            # Main CLI interface
│   └── tools/
│       ├── __init__.py
│       └── tool_manager.py   # Tool implementations
├── pyproject.toml            # Project configuration
└── README.md
```
## Model Setup

Before first use, pull a model in Ollama that supports tool calling:

```bash
# Choose one of these models:
ollama pull qwen3
ollama pull llama3.1
```
## Requirements
- Python 3.10+
- Ollama installed and running
- An Ollama model that supports tool calling (e.g., Qwen3, Llama3.1+)
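One quick way to verify the setup is to ask Ollama for its installed models, e.g. with a check like this (assuming the `ollama` Python client is installed and its current `list()` response shape):

```python
import sys
import ollama

# Confirm Ollama is reachable and a tool-calling-capable model is pulled.
try:
    models = [m.model for m in ollama.list().models]
except Exception as err:
    sys.exit(f"Ollama does not appear to be running: {err}")

if any(name.startswith(("qwen3", "llama3.1")) for name in models):
    print("Ready. Installed models:", ", ".join(models))
else:
    print("No tool-calling model found; try `ollama pull qwen3`.")
```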
## Dependencies
- [Rich](https://github.com/Textualize/rich) - For beautiful terminal formatting
- [Click](https://click.palletsprojects.com/) - For the command-line interface
- [Ollama Python Client](https://github.com/ollama/ollama-python) - For Ollama integration
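As an illustration of how Click wires up a CLI like this one, a command with an optional prompt argument and a `--model` option might look as follows (a hypothetical sketch with placeholder logic, not the package's actual entry point):

```python
import click

@click.command()
@click.option("--model", default="qwen3", show_default=True, help="Ollama model to use.")
@click.argument("prompt", required=False)
def main(model: str, prompt: str | None) -> None:
    """Run PROMPT once, or start an interactive session when PROMPT is omitted."""
    if prompt:
        click.echo(f"[{model}] single-shot prompt: {prompt}")  # placeholder logic
    else:
        click.echo(f"[{model}] entering interactive mode")     # placeholder logic

if __name__ == "__main__":
    main()
```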
## Raw data

```json
{
"_id": null,
"home_page": null,
"name": "ollama-code-cli",
"maintainer": null,
"docs_url": null,
"requires_python": ">=3.10",
"maintainer_email": null,
"keywords": "cli, ollama, llm, ai, coding",
"author": null,
"author_email": "Vigyat Goel <vigyatgoel@gmail.com>",
"download_url": "https://files.pythonhosted.org/packages/70/d4/89cd5fc378da7c220d8845c2fceb47387f4b02890ed82249db93921897ad/ollama_code_cli-1.0.1.tar.gz",
"platform": null,
"description": "# Ollama Code CLI\n\nA beautiful, interactive command-line interface tool for coding tasks using local LLMs via Ollama with tool calling capabilities.\n\n## Features\n\n- \ud83c\udfa8 **Beautiful CLI Interface** - Rich colors and structured output\n- \ud83e\udd16 **Local AI Power** - Interact with local LLMs through Ollama\n- \ud83d\udee0\ufe0f **Tool Calling** - Execute coding-related tools (file operations, code execution, etc.)\n- \ud83d\udcac **Interactive Mode** - Maintain conversation context for multi-turn interactions\n- \ud83d\udcdd **Markdown Support** - Beautifully formatted responses with syntax highlighting\n- \ud83d\udccb **Structured Output** - Clear panels and tables for tool calls and results\n\n## Installation\n\n```bash\npip install ollama-code-cli\n```\n\n## Usage\n\n```bash\n# Start an interactive session\nollama-code-cli\n\n# Run a single command\nollama-code-cli \"Create a Python function to calculate factorial\"\n\n# Use a specific model\nollama-code-cli --model llama3.1 \"Explain how async/await works in Python\"\n```\n\n## Available Tools\n\n- `read_file`: Read the contents of a file\n- `write_file`: Write content to a file\n- `execute_code`: Execute code in a subprocess\n- `list_files`: List files in a directory\n- `run_command`: Run a shell command\n\n## Examples\n\n1. Create a Python script and save it to a file:\n ```bash\n ollama-code-cli \"Create a Python script that calculates factorial and save it to a file named factorial.py\"\n ```\n\n2. Read a file and explain its contents:\n ```bash\n ollama-code-cli \"Read the contents of main.py and explain what it does\"\n ```\n\n3. Execute a shell command:\n ```bash\n ollama-code-cli \"List all files in the current directory\"\n ```\n\n## Interactive Mode\n\nLaunch the interactive mode for a conversational experience:\n\n```bash\nollama-code-cli\n```\n\nIn interactive mode, you can:\n- Have multi-turn conversations with the AI\n- See beautiful formatted responses with Markdown support\n- Watch tool calls and results in real-time with visual panels\n- Clear conversation history with the `clear` command\n- Exit gracefully with the `exit` command\n\n## Project Structure\n\n```\nollamacode/\n\u251c\u2500\u2500 ollamacode/\n\u2502 \u251c\u2500\u2500 __init__.py\n\u2502 \u251c\u2500\u2500 cli/\n\u2502 \u2502 \u251c\u2500\u2500 __init__.py\n\u2502 \u2502 \u2514\u2500\u2500 cli.py # Main CLI interface\n\u2502 \u251c\u2500\u2500 tools/\n\u2502 \u2502 \u251c\u2500\u2500 __init__.py\n\u2502 \u2502 \u2514\u2500\u2500 tool_manager.py # Tool implementations\n\u251c\u2500\u2500 pyproject.toml # Project configuration\n\u2514\u2500\u2500 README.md\n```\n\n## Installation\n\nFirst, install a compatible model in Ollama:\n```bash\n# Choose one of these models:\nollama pull qwen3\nollama pull llama3.1\n```\n\nThen install the CLI:\n```bash\npip install ollama-code-cli\n```\n\n## Requirements\n\n- Python 3.13+\n- Ollama installed and running\n- An Ollama model that supports tool calling (e.g., Qwen3, Llama3.1+)\n\n## Dependencies\n\n- [Rich](https://github.com/Textualize/rich) - For beautiful terminal formatting\n- [Click](https://click.palletsprojects.com/) - For command-line interface\n- [Ollama Python Client](https://github.com/ollama/ollama-python) - For Ollama integration\n",
"bugtrack_url": null,
"license": null,
"summary": "A CLI tool for coding tasks using local LLMs with tool calling",
"version": "1.0.1",
"project_urls": {
"Homepage": "https://github.com/vigyatgoel/ollama-code-cli",
"Issues": "https://github.com/vigyatgoel/ollama-code-cli/issues",
"Repository": "https://github.com/vigyatgoel/ollama-code-cli"
},
"split_keywords": [
"cli",
" ollama",
" llm",
" ai",
" coding"
],
"urls": [
{
"comment_text": null,
"digests": {
"blake2b_256": "63d7dc38d09200154a6ad386b55fd6305f7e7b57e790e8f381e1d56eddd2ce4d",
"md5": "f8461e24ad803f890169d0ff590df171",
"sha256": "ef888b2d1ede0eb5476753e4beed02a59ed3e5712b48107949cef5e94216565e"
},
"downloads": -1,
"filename": "ollama_code_cli-1.0.1-py3-none-any.whl",
"has_sig": false,
"md5_digest": "f8461e24ad803f890169d0ff590df171",
"packagetype": "bdist_wheel",
"python_version": "py3",
"requires_python": ">=3.10",
"size": 10142,
"upload_time": "2025-08-31T06:51:34",
"upload_time_iso_8601": "2025-08-31T06:51:34.869650Z",
"url": "https://files.pythonhosted.org/packages/63/d7/dc38d09200154a6ad386b55fd6305f7e7b57e790e8f381e1d56eddd2ce4d/ollama_code_cli-1.0.1-py3-none-any.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": null,
"digests": {
"blake2b_256": "70d489cd5fc378da7c220d8845c2fceb47387f4b02890ed82249db93921897ad",
"md5": "46ebbd8d28ead075295cb12e0be9ac05",
"sha256": "8a8c4b1355b52d60663a4356507bcfa25f9c8a6202577c42bb366e0d2c3102a2"
},
"downloads": -1,
"filename": "ollama_code_cli-1.0.1.tar.gz",
"has_sig": false,
"md5_digest": "46ebbd8d28ead075295cb12e0be9ac05",
"packagetype": "sdist",
"python_version": "source",
"requires_python": ">=3.10",
"size": 10063,
"upload_time": "2025-08-31T06:51:36",
"upload_time_iso_8601": "2025-08-31T06:51:36.406211Z",
"url": "https://files.pythonhosted.org/packages/70/d4/89cd5fc378da7c220d8845c2fceb47387f4b02890ed82249db93921897ad/ollama_code_cli-1.0.1.tar.gz",
"yanked": false,
"yanked_reason": null
}
],
"upload_time": "2025-08-31 06:51:36",
"github": true,
"gitlab": false,
"bitbucket": false,
"codeberg": false,
"github_user": "vigyatgoel",
"github_project": "ollama-code-cli",
"travis_ci": false,
"coveralls": false,
"github_actions": false,
"lcname": "ollama-code-cli"
}
```