prompt-scribe

Name: prompt-scribe
Version: 0.1.0
Summary: A powerful, template-based prompt composer for crafting and managing complex instructions for LLMs.
Author: Plyanter Viktor
Upload time: 2025-10-29 04:44:22
Requires Python: >=3.14
Keywords: llm, ai, prompt-engineering, developer-tools, jinja2, cli
            <table>
  <tr>
    <td align="center" style="border: none; padding: 0;">
      <img src="https://raw.githubusercontent.com/scribe-works/prompt-scribe/main/docs/assets/logo.png" alt="The Scribe Works Logo" width="200"/>
    </td>
    <td style="border: none; padding: 0; vertical-align: middle;">
      <h1>Prompt Scribe</h1>
      <p>A powerful, template-based prompt composer for crafting and managing complex instructions for LLMs.</p>
    </td>
  </tr>
</table>

---

## The Problem

Managing large, multi-part prompts for Large Language Models can be messy. You often combine a persona, rules, context from different files, and a specific query. Doing this manually by copy-pasting is tedious and error-prone.

## The Solution

**Prompt Scribe** automates this process. It uses a simple YAML configuration and the powerful Jinja2 templating engine to compose your final prompts from various source files, ready to be used in any LLM chat or API.

## Key Features

-   **Template-Based:** Use Jinja2 templates for maximum flexibility.
-   **Modular:** Split your prompts into reusable parts (personas, includes, rules).
-   **Safe Initialization:** Creates a dedicated `.prompt_scribe/` directory to avoid cluttering your project root.
-   **Watch Mode:** Automatically recompose prompts when source files change.
-   **Beautiful CLI:** A clean, helpful command-line interface.
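
The core idea behind these features can be sketched with the standard library. This is an illustrative stand-in only: it uses `string.Template` in place of Jinja2, and the variable names (`persona`, `rules`, `query`) are hypothetical, not prompt-scribe's actual internals.

```python
from string import Template

# Hypothetical sources that would normally live in separate files
# (a persona file, an includes file, and an ad-hoc query).
persona = "You are a meticulous senior code reviewer."
rules = "1. Prefer clarity over cleverness.\n2. Flag missing tests."
query = "Review the attached diff."

# A stand-in for a Jinja2 master template: one layout, many inputs.
master = Template("$persona\n\n## Rules\n$rules\n\n## Query\n$query")

# Composition is just rendering the template with the gathered parts.
prompt = master.substitute(persona=persona, rules=rules, query=query)
print(prompt)
```

Jinja2 adds loops, conditionals, and includes on top of this basic substitution, which is what makes splitting prompts into reusable parts practical.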

## Installation

```bash
pip install prompt-scribe
```

## Quick Start

1.  **Initialize a new project:**
    Navigate to your project directory and run:
    ```bash
    prompt-scribe init
    ```
    This will create a self-contained `.prompt_scribe/` directory with a default structure:
    ```
    .
    └── .prompt_scribe/
        ├── composed_prompts/
        ├── includes/
        │   └── development-rules.md
        ├── personas/
        │   └── code-reviewer.md
        ├── templates/
        │   └── master.jinja2
        └── prompts.yml
    ```

2.  **Compose your prompts:**
    Run the compose command from your project root:
    ```bash
    prompt-scribe compose
    ```
    This will read `.prompt_scribe/prompts.yml`, process the `example-code-reviewer` agent, and generate the final prompt in `.prompt_scribe/composed_prompts/`.

3.  **Use the output:**
    Open the generated file, copy the content, and paste it into your LLM chat interface.

## Usage

### Composing Prompts

-   **Compose all agents**:
    ```bash
    prompt-scribe compose
    ```
-   **Compose specific agents**:
    ```bash
    prompt-scribe compose agent-one agent-two
    ```
-   **Watch for changes** and automatically recompose:
    ```bash
    prompt-scribe compose --watch
    ```

## Configuration (`.prompt_scribe/prompts.yml`)

The `prompts.yml` file is the heart of your project. It resolves all paths relative to its own location.
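
"Relative to its own location" means, concretely, something like the following sketch (illustrative only; prompt-scribe's actual resolution logic may differ):

```python
from pathlib import Path

# The config file anchors all relative paths, so composition works
# the same no matter which directory the CLI is invoked from.
config_path = Path(".prompt_scribe/prompts.yml")
personas_dir = "personas"  # value taken from the settings block below

# Resolution: join against the directory containing prompts.yml.
resolved = config_path.parent / personas_dir
print(resolved)  # .prompt_scribe/personas
```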

```yaml
# Global settings for all agents
settings:
  personas_dir: "personas"
  includes_dir: "includes"
  templates_dir: "templates"
  output_dir: "composed_prompts"

# Map of agents to be composed
agents:
  example-code-reviewer:
    # Jinja2 template to use for rendering
    template: master.jinja2
    # The final output file name
    output_file: example_code_reviewer.md
    # The persona section, available as `persona` in the template
    persona:
      file: personas/code-reviewer.md
    # A list of content sections, available as `sections` in the template
    sections:
      - title: "📜 Key Development Rules"
        prologue: "These are the mandatory rules and principles for this project."
        file: includes/development-rules.md
      - title: "📄 Code Snippet for Review"
        # You can also include content directly
        content: |
          ```python
          def hello_world():
              print("Hello, Scribe!")
          ```
```
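
A `master.jinja2` matching this configuration might look like the sketch below. This is hypothetical: the config comments indicate that `persona` and `sections` are the variables exposed to the template, but the attribute names (`.content`, `.title`, `.prologue`) are assumptions, and the template actually scaffolded by `prompt-scribe init` may differ.

```jinja2
{{ persona.content }}

{% for section in sections %}
## {{ section.title }}
{% if section.prologue %}{{ section.prologue }}{% endif %}

{{ section.content }}
{% endfor %}
```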

## License

This project is licensed under the MIT License. See the [LICENSE](LICENSE) file for details.
            
