| Field | Value |
| --- | --- |
| Name | gerg |
| Version | 0.0.1 |
| home_page | None |
| Summary | CLI agent that collaborates with you by planning and running shell commands via a local Ollama model |
| upload_time | 2025-10-06 22:36:06 |
| maintainer | None |
| docs_url | None |
| author | Greg Gottlieb |
| requires_python | >=3.9 |
| license | MIT |
| keywords | ollama, cli, agent, shell, automation |
| VCS | https://github.com/ggottli/gerg |
| bugtrack_url | None |
| requirements | No requirements were recorded. |
| Travis-CI | No Travis. |
| coveralls test coverage | No coveralls. |

**gerg** is a command‑line agent that uses your local Ollama model to plan shell commands for a natural‑language task. It can either print the proposed command to help you, or execute it for you.
> Safety-first: by default gerg *asks before running anything*. Use `--yes` to auto‑run and `--allow-unsafe` to permit risky commands.
## Install
```bash
pip install gerg
```
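gerg talks to a locally running Ollama server, which is installed separately. Below is a minimal setup sketch: the virtual environment is optional, `llama3:8b` is only an example model (it also appears in the Examples section), and `ollama pull` / `ollama list` are Ollama's own CLI commands, not part of gerg.

```bash
# Optional: isolate gerg in a virtual environment.
python3 -m venv .venv
source .venv/bin/activate
pip install gerg

# Ollama is installed separately; pull a model for it to serve.
ollama pull llama3:8b
ollama list   # confirm the model is available locally
```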
## Quick start
1) Make sure Ollama is running locally (default `http://127.0.0.1:11434`); a quick reachability check is sketched just after this list.
2) Run gerg:
```bash
gerg "list all files in my Downloads directory"
```
3) Approve the plan or run automatically:
```bash
gerg -y "compress the Downloads folder into downloads.zip"
```
4) Need multi-step planning and commands? Enable the `--think` flag for chain-of-thought planning and execution.
5) Change the default model with:
```bash
export GERG_MODEL="DESIRED MODEL"
```
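Before step 2, you can confirm the local Ollama server is reachable and see which models it has. The sketch below uses Ollama's standard local API; `llama3:8b` is only an example value for `GERG_MODEL`.

```bash
# List the models the local Ollama server can serve (it listens on
# 127.0.0.1:11434 by default); an error here means Ollama isn't running.
curl http://127.0.0.1:11434/api/tags

# Point gerg at one of the models shown, e.g.:
export GERG_MODEL="llama3:8b"
```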
## Examples
```bash
# Only print commands (never execute)
gerg --print "find the 5 largest files under ~/Downloads"
# Use a different model just for this run
gerg -m llama3:8b "init a git repo, make first commit"
# Work from another directory (without cd'ing first)
gerg --cwd ~/Projects/website "build the site and serve locally"
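# Plan a multi-step task with chain-of-thought before executing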
gerg --think "create a .txt file in my Documents folder with a simple rhyme"
```
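If you mostly want to review commands before deciding anything, a small shell helper can make `--print` the default. This is just a convenience sketch built on the flags documented above; `gerg_preview` is a hypothetical name, not part of gerg itself.

```bash
# Hypothetical helper: always preview what gerg would run, never execute.
gerg_preview() {
  gerg --print "$*"
}

gerg_preview "find the 5 largest files under ~/Downloads"
```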
## Raw data

```json
{
"_id": null,
"home_page": null,
"name": "gerg",
"maintainer": null,
"docs_url": null,
"requires_python": ">=3.9",
"maintainer_email": null,
"keywords": "ollama, cli, agent, shell, automation",
"author": "Greg Gottlieb",
"author_email": null,
"download_url": "https://files.pythonhosted.org/packages/75/fb/7258a827b86f78d6b421522898583f8efc4a31b0978d6b62ddd1a1eae448/gerg-0.0.1.tar.gz",
"platform": null,
"description": "**gerg** is a command\u2011line agent that uses your local Ollama model to plan shell commands for a natural\u2011language task. It can either output the command to help, or execute it for you.\n\n\n> Safety-first: by default gerg *asks before running anything*. Use `--yes` to auto\u2011run, and see `--allow-unsafe` to permit risky commands.\n\n\n## Install\n\n\n```bash\npip install gerg\n```\n\n\n## Quick start\n\n\n1) Make sure Ollama is running locally (default `http://127.0.0.1:11434`).\n2) Run gerg:\n```bash\ngerg \"list all files in my Downloads directory\"\n```\n3) Approve the plan or run automatically:\n```bash\ngerg -y \"compress the Downloads folder into downloads.zip\"\n```\n4) Need multi-step planning and commands. Enable the --think flag for Chain of Thought planning and execution.\n5) Change default model with:\n```bash\nexport GERG_MODEL=\"DESIRED MODEL\"\n```\n\n## Examples\n\n\n```bash\n# Only print commands (never execute)\ngerg --print \"find the 5 largest files under ~/Downloads\"\n\n\n# Use a different model just for this run\ngerg -m llama3:8b \"init a git repo, make first commit\"\n\n\n# Work from another directory (without cd'ing first)\ngerg --cwd ~/Projects/website \"build the site and serve locally\"\n\ngerg --think \"create a .txt file in my Documents folder with a simple rhyme\"\n```\n",
"bugtrack_url": null,
"license": "MIT",
"summary": "CLI agent that collaborates with you by planning and running shell commands via a local Ollama model",
"version": "0.0.1",
"project_urls": {
"Homepage": "https://github.com/ggottli/gerg"
},
"split_keywords": [
"ollama",
" cli",
" agent",
" shell",
" automation"
],
"urls": [
{
"comment_text": null,
"digests": {
"blake2b_256": "398d6c57524ef2b1f607de7d3840256bf91c01bc3f4d0c99445d007ca230ddef",
"md5": "96a1f03fabbc4652ecd660667aa3967f",
"sha256": "680b8653d6af115490ba2fa7ac403fed8ae324aa32e6a83e5c4f1377fce78785"
},
"downloads": -1,
"filename": "gerg-0.0.1-py3-none-any.whl",
"has_sig": false,
"md5_digest": "96a1f03fabbc4652ecd660667aa3967f",
"packagetype": "bdist_wheel",
"python_version": "py3",
"requires_python": ">=3.9",
"size": 11692,
"upload_time": "2025-10-06T22:36:05",
"upload_time_iso_8601": "2025-10-06T22:36:05.194905Z",
"url": "https://files.pythonhosted.org/packages/39/8d/6c57524ef2b1f607de7d3840256bf91c01bc3f4d0c99445d007ca230ddef/gerg-0.0.1-py3-none-any.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": null,
"digests": {
"blake2b_256": "75fb7258a827b86f78d6b421522898583f8efc4a31b0978d6b62ddd1a1eae448",
"md5": "11464b8298e9b00df9ca6a749d3c701b",
"sha256": "acc72d18a7830da545ff34c8725e4e580b6bdb3230510d695b3bdbaded4c590a"
},
"downloads": -1,
"filename": "gerg-0.0.1.tar.gz",
"has_sig": false,
"md5_digest": "11464b8298e9b00df9ca6a749d3c701b",
"packagetype": "sdist",
"python_version": "source",
"requires_python": ">=3.9",
"size": 10999,
"upload_time": "2025-10-06T22:36:06",
"upload_time_iso_8601": "2025-10-06T22:36:06.053317Z",
"url": "https://files.pythonhosted.org/packages/75/fb/7258a827b86f78d6b421522898583f8efc4a31b0978d6b62ddd1a1eae448/gerg-0.0.1.tar.gz",
"yanked": false,
"yanked_reason": null
}
],
"upload_time": "2025-10-06 22:36:06",
"github": true,
"gitlab": false,
"bitbucket": false,
"codeberg": false,
"github_user": "ggottli",
"github_project": "gerg",
"travis_ci": false,
"coveralls": false,
"github_actions": false,
"lcname": "gerg"
}
```
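The digests above can be used to verify a downloaded artifact. A minimal sketch, assuming GNU `sha256sum` (on macOS, `shasum -a 256` is the equivalent); the URL and hash are copied from the raw data:

```bash
# Fetch the sdist listed above and check it against the recorded sha256 digest.
curl -LO "https://files.pythonhosted.org/packages/75/fb/7258a827b86f78d6b421522898583f8efc4a31b0978d6b62ddd1a1eae448/gerg-0.0.1.tar.gz"
echo "acc72d18a7830da545ff34c8725e4e580b6bdb3230510d695b3bdbaded4c590a  gerg-0.0.1.tar.gz" | sha256sum -c -
```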