promptguru

Name: promptguru
Version: 0.1.0
Summary: Modular prompt engineering library
Author: Naga Adithya Kaushik <adithyakaushikch@gmail.com>
Homepage: https://huggingface.co/spaces/GenAIDevTOProd/PromptGuru
Upload time: 2025-08-13 22:52:51
Requires Python: >=3.8
License: Apache-2.0
Keywords: prompts, nlp, llm, huggingface, templates
Requirements: none recorded
# PromptGuru

**Modular prompt engineering library** for BERT, Mistral, LLaMA, and FLAN-T5 using YAML templates.
Includes modes like **ELI5**, **DevMode**, **Refine**, **Classification**, and **QA**.

## Why
- Lightweight and framework-agnostic
- YAML-first: edit prompts without changing code
- Consistent modes across multiple model families

## Install (Local Dev)
```bash
pip install PyYAML
```
> For local development, clone or copy this repo and install the single runtime dependency above. The package is also published to PyPI as `promptguru`.

## Usage
```python
from promptguru.engine import PromptEngine

# Pick a model family and one of the prompting modes defined in its YAML template.
engine = PromptEngine(model_type="mistral", mode="eli5")
prompt = engine.generate_prompt("What is quantum entanglement?")
print(prompt)
```
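To make the YAML-first idea concrete, here is a minimal self-contained sketch of how such an engine can work, using only the standard library. The `SketchEngine` class and the inline `TEMPLATES` dict (standing in for a parsed template file) are illustrations, not PromptGuru's actual implementation:

```python
# Minimal sketch of a template-driven prompt engine (illustration only).
# TEMPLATES stands in for a parsed YAML file such as mistral.yaml:
# model family -> mode -> template string with a {question} placeholder.
TEMPLATES = {
    "mistral": {
        "eli5": "Explain like I'm five: {question}",
        "refine": "Rewrite the following to be clearer:\n{question}",
    }
}

class SketchEngine:
    def __init__(self, model_type: str, mode: str):
        # Look up the template once; raises KeyError for unknown model/mode.
        self.template = TEMPLATES[model_type][mode]

    def generate_prompt(self, question: str) -> str:
        # Fill the placeholder with the user's input.
        return self.template.format(question=question)

engine = SketchEngine(model_type="mistral", mode="eli5")
print(engine.generate_prompt("What is quantum entanglement?"))
# → Explain like I'm five: What is quantum entanglement?
```

Because the templates are plain data, swapping a prompt style means editing the YAML file, not the Python code.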

## Templates
Templates live in `promptguru/templates/`:
- `bert.yaml` → `classification`, `fill_mask`, `qa`
- `mistral.yaml` → `eli5`, `devmode`, `refine`
- `llama.yaml` → `eli5`, `devmode`, `refine`
- `flan_t5.yaml` → `eli5`, `devmode`, `explain_and_tag`
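A template file might be shaped like the following — a hypothetical sketch, since the actual keys and wording of `mistral.yaml` are not shown on this page:

```yaml
# Hypothetical shape of promptguru/templates/mistral.yaml
eli5: |
  Explain the following as if to a five-year-old:
  {question}
devmode: |
  You are a senior engineer. Answer precisely and show code where useful:
  {question}
refine: |
  Rewrite the following text to be clearer and more concise:
  {question}
```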

## Roadmap
- Add inference adapters (HF Inference API, OpenRouter) behind a common interface
- Add more modes (contrastive QA, chain-of-thought, safety/risk tags)

## License
Apache 2.0

            
