promptml-cli 1.0.1 (PyPI)

- Summary: A CLI tool to run PromptML scripts
- Uploaded: 2024-07-23 04:59:27
- Requires Python: >=3.8
- Keywords: artificial-intelligence, dsl, generative-ai, language, prompt-engineering, promptml, promptml-cli
- Requirements: promptml==0.6.1, openai==1.33.0, rich==13.7.1, google-generativeai==0.6.0, click==8.1.7
- Source: https://github.com/narenaryan/promptml-cli/
# promptml-cli
A CLI application to run PromptML scripts against LLMs.

## Installation
```bash
pip install --upgrade promptml-cli
```

This installs a command called `procli`.

## Demo
[![asciicast](https://asciinema.org/a/664270.svg)](https://asciinema.org/a/664270)

## Usage

```bash
procli --help


Usage: procli [OPTIONS]

Options:
  -f, --file PATH                 Path to the PromptML(.pml) file  [required]
  -m, --model TEXT                Model to use for the completion
  -s, --serializer [xml|json|yaml]
                                  Serializer to use for the completion.
                                  Default is `xml`
  -p, --provider [openai|google|ollama]
                                  GenAI provider to use for the completion.
                                  Default is `openai`
  --no-stream                     Get whole GenAI response. Default is
                                  streaming response.
  --raw                           Return raw output from LLM (best for saving
                                  into files or piping)
  --help                          Show this message and exit.

For more details of composing PromptML files, visit: https://promptml.org/
```
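The options above can be combined freely. As an illustration (a hypothetical helper, not part of promptml-cli itself), the following Python sketch assembles a `procli` invocation from the documented options and their defaults (`xml` serializer, `openai` provider, streaming on):

```python
# Hypothetical helper (not part of promptml-cli): build a procli command
# line from the options documented above, applying the documented defaults.
def build_cmd(file, model=None, serializer="xml", provider="openai",
              stream=True, raw=False):
    cmd = ["procli", "-f", file, "-s", serializer, "-p", provider]
    if model:
        cmd += ["-m", model]          # -m/--model is optional
    if not stream:
        cmd.append("--no-stream")     # collect the whole response at once
    if raw:
        cmd.append("--raw")           # unformatted output, good for piping
    return cmd

print(" ".join(build_cmd("character.pml", model="phi3", provider="ollama")))
# procli -f character.pml -s xml -p ollama -m phi3
```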
## Example

1. Create a PromptML file `character.pml` with the following content:

```promptml
@prompt
    @context
        You are a military general in the Roman army.
    @end

    @objective
        Describe a regular day in your life.
    @end

    @instructions
        @step
            Be cheeky and sarcastic.
        @end
    @end

    @category
        Simulation
    @end
@end
```
See PromptML [documentation](https://www.promptml.org/) for more details about the syntax.
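If you prefer to create the file from code, here is a minimal Python sketch that writes `character.pml` and runs a naive balance check (counting block openers against `@end` markers; this is an illustration, not the official promptml parser):

```python
from pathlib import Path

# The PromptML script from the example above.
PML = """@prompt
    @context
        You are a military general in the Roman army.
    @end

    @objective
        Describe a regular day in your life.
    @end

    @instructions
        @step
            Be cheeky and sarcastic.
        @end
    @end

    @category
        Simulation
    @end
@end
"""

Path("character.pml").write_text(PML)

# Naive sanity check (not the promptml parser): every block opener
# should be paired with exactly one @end.
opens = sum(PML.count(f"@{kw}")
            for kw in ("prompt", "context", "objective",
                       "instructions", "step", "category"))
assert opens == PML.count("@end")
```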

2. Set your OpenAI API key as an environment variable:

```bash
export OPEN_AI_API_KEY=your-openai-api-key
```

Or, if you are using Google GenAI:

```bash
export GOOGLE_API_KEY=your-google-api-key
```
If you are using a local Ollama model, no API key is required.
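The provider-to-credential mapping described above can be summarized in a small illustrative helper (this mirrors the instructions, not procli's internals):

```python
import os

# Illustrative mapping (not procli internals): the environment variable
# each provider reads, per the instructions above. Ollama needs no key.
API_KEY_VARS = {"openai": "OPEN_AI_API_KEY", "google": "GOOGLE_API_KEY", "ollama": None}

def missing_key(provider):
    """Return the name of the required env var if it is unset, else None."""
    var = API_KEY_VARS[provider]
    if var is not None and not os.environ.get(var):
        return var
    return None
```

For example, `missing_key("ollama")` is always `None`, while `missing_key("openai")` returns `"OPEN_AI_API_KEY"` until that variable is exported.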

3. Run the PromptML file with the following command in your terminal:

```bash
procli -f character.pml -p ollama -m phi3
```

You will see output like the following in your terminal:

```text
Ah, so you want a glimpse into the life of a Roman general, do you? Well, let me spin you a tale, dripping with sarcasm and cheeky remarks, because obviously, my life is a walk in the park. Shall we?

The day usually starts with the soothing sounds of soldiers clanging their swords and shields together at some ungodly hour. Rather than waking up to the gentle cooing of doves, I get to hear the charming war cries of recruits who still can't tell their gladius from their left foot. Delightful, isn't it?

After I drag myself out of what I'm convinced is a sack filled with rocks they call a bed, it's straight to the strategy tent. Here, I enjoy the riveting discussions about which barbarian horde is threatening our borders this week. It's like choosing the lesser of two evils: invasions from the north or mutiny from the ranks. Decisions, decisions!

Next on the agenda is overseeing training. Oh yes, I just love watching greenhorns stumble through basic drills. The way they handle their weapons – you'd think a lopsided stick had suddenly become the deadliest thing in their hands. But hey, a general's got to humor them, right?

Then there's the daily feast of dried meat and stale bread, washed down with wine that's likely been used as paint thinner. Ah, the joys of Roman culinary delights. I'm sure Bacchus himself is weeping with laughter somewhere.

Afternoons are reserved for dealing with the Senate's missives, those beautifully crafted scrolls filled with ‘helpful’ suggestions and veiled threats. It's like mail time with a hint of doomsday. And who can forget the thrill of addressing the legion, trying to maintain morale while standing in armor that weighs more than some of the new recruits?

As evening falls, I get to review the day's progress with my centurions, who conveniently bring me the freshest of problems right before bedtime. If I’m lucky, I'll dodge an assassination attempt or two – keeps life exciting, don’t you think?

Finally, I retire for the night, eager to wake up and do it all over again. Really, what's not to love? So there you have it! Just another average day in the life of a Roman general – a blend of strategy, sarcasm, and just a dash of masochism.

Time taken: 26.04 seconds
```

## Streaming & Non-streaming Responses

By default, procli streams the response from the GenAI model. You can force the CLI tool to collect the whole response before printing it by using the `--no-stream` flag.


```bash
# Streaming response
procli -f character.pml -p google
```

```bash
# Non-streaming response
procli -f character.pml -p google --no-stream
```

## Raw Output

You can also get the raw output from the GenAI model by using the `--raw` flag. This returns the model's output without any terminal formatting.

```bash
# Raw Markdown output
procli -f character.pml -p openai --raw
```

Note: Raw output is useful when you want to save the output to a file or pipe it to another command.
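If you are scripting around the CLI, one way to capture the raw output is Python's `subprocess` module. A hedged sketch (it assumes `procli` is installed and on your PATH; the helper name is ours, not part of promptml-cli):

```python
import subprocess
from pathlib import Path

# Sketch: run a command, capture its stdout, and save it to a file.
# Works for any CLI, e.g.:
#   run_and_save(["procli", "-f", "character.pml", "--raw"], "character.md")
def run_and_save(cmd, out_path):
    result = subprocess.run(cmd, capture_output=True, text=True, check=True)
    Path(out_path).write_text(result.stdout)
    return result.stdout
```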

## TODO
- Add support for Claude, Cohere & AI21 Labs GenAI models
- Add tests

            
