promptliy


Name: promptliy
Version: 1.0.2
home_page: None
Summary: Promptliy SDK for accessing Promptliy API
upload_time: 2025-07-10 20:36:44
maintainer: None
docs_url: None
author: None
requires_python: >=3.7
license: Promptliy SDK License

Copyright © 2025 Promptliy Inc.

This software is proprietary and protected under applicable copyright laws. By using this software, you agree to the following:

1. You may use this SDK:
   - Free of charge on the Free Tier of Promptliy services
   - For evaluation or development purposes
   - In production only with an active Promptliy subscription

2. Restrictions:
   - You may NOT sublicense, distribute, or reverse-engineer this software
   - You may NOT modify or resell this SDK or derivative works
   - You may NOT use this SDK outside the Promptliy API without a license

3. The SDK is provided “AS IS” without warranties. Promptliy Inc. shall not be held liable for any damages.

By using this software, you accept these terms. For licensing inquiries or enterprise use, contact legal@promptliy.ai.
keywords: promptliy, prompts, api, client, sdk, ai, python
requirements: No requirements were recorded.
# promptliy-client

A lightweight Python SDK to fetch and format AI prompts from [Promptliy.ai](https://promptliy.ai) using your project API key.

---

## Installation

```bash
pip install promptliy
```

> Optional (recommended for caching in long-running environments like servers):
```bash
pip install cachetools
```

---

## What is Promptliy?

[Promptliy](https://promptliy.ai) helps teams manage, version, and collaborate on production-ready AI prompts with live context, variables, version history, and client libraries for devs.

---

## Usage

### 1. Initialize the client

```python
from promptliy import PromptliyClient

promptliy_client = PromptliyClient(project_key="pl_sk_abc123yourkey")
```
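
In real deployments you will probably not hardcode the key. A minimal sketch that reads it from an environment variable (the `PROMPTLIY_PROJECT_KEY` name is just an example here, not something the SDK requires):

```python
import os

from promptliy import PromptliyClient

# Read the project key from the environment; the variable name is arbitrary.
project_key = os.environ.get("PROMPTLIY_PROJECT_KEY")
if not project_key:
    raise RuntimeError("Set PROMPTLIY_PROJECT_KEY before starting the app")

promptliy_client = PromptliyClient(project_key=project_key)
```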

### 2. Using a prompt with variables

This is the most common use case. Your prompt on Promptliy.ai contains `{{placeholders}}`.

```python
# Fetches the "onboarding-email" prompt and formats it with the provided values
output = promptliy_client.format("onboarding-email", {
    "name": "Ava",
    "product": "Promptliy"
})

print(output)
# Output: "Subject: Welcome, Ava! Hey Ava, Thanks for joining Promptliy..."
```

### 3. Using a prompt without variables (static prompt)

If your prompt has no variables, you can simply call `.format()` with an empty dictionary. If you are using the latest version of the SDK, the second argument is optional.

```python
# Fetches a static prompt, like a system instruction or a fixed message.
# Both methods below are valid.

# Method A: Passing an empty dictionary (works on all versions)
system_prompt = promptliy_client.format("system-instruction", {})

# Method B: Omitting the values (more convenient, recommended)
system_prompt = promptliy_client.format("system-instruction")


print(system_prompt)
# Output: "You are a helpful AI assistant specializing in software development."
```

---

## Example: Use with LLMs

### OpenAI (ChatGPT, GPT-4)

```python
from openai import OpenAI
client = OpenAI(api_key="sk-...")

prompt_text = promptliy_client.format("chat-prompt", { "topic": "AI" })

response = client.chat.completions.create(
    model="gpt-4",
    messages=[{"role": "user", "content": prompt_text}]
)

print(response.choices[0].message.content)
```
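
You can also combine a static system prompt with a formatted user prompt in the same request. A sketch, assuming the `system-instruction` and `chat-prompt` keys from the examples above exist in your project:

```python
# Static system prompt plus a user prompt rendered with variables.
system_text = promptliy_client.format("system-instruction", {})
user_text = promptliy_client.format("chat-prompt", {"topic": "AI"})

response = client.chat.completions.create(
    model="gpt-4",
    messages=[
        {"role": "system", "content": system_text},
        {"role": "user", "content": user_text},
    ],
)
print(response.choices[0].message.content)
```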

### Claude (Anthropic)

```python
from anthropic import Anthropic

anthropic = Anthropic(api_key="your-anthropic-key")

prompt_text = promptliy_client.format("claude-prompt", { "question": "What is PromptOps?" })

response = anthropic.messages.create(
    model="claude-3-opus-20240229",
    max_tokens=500,
    messages=[{"role": "user", "content": prompt_text}]
)

print(response.content[0].text)  # the first content block holds the generated text
```

### Google Gemini

```python
import google.generativeai as genai

genai.configure(api_key="your-gemini-api-key")

prompt_text = promptliy_client.format("gemini-prompt", { "question": "What is PromptOps?" })

response = genai.GenerativeModel("gemini-1.5-pro").generate_content(prompt_text)
print(response.text)
```
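
If you call several prompts this way, it can help to wrap the render-then-call pattern in a small helper. A sketch of one possible wrapper around the OpenAI client from above (the helper name is hypothetical, not part of the SDK):

```python
from typing import Optional


def answer_with_openai(prompt_key: str, variables: Optional[dict] = None) -> str:
    """Render a Promptliy prompt and send it to OpenAI in one step."""
    prompt_text = promptliy_client.format(prompt_key, variables or {})
    response = client.chat.completions.create(
        model="gpt-4",
        messages=[{"role": "user", "content": prompt_text}],
    )
    return response.choices[0].message.content


print(answer_with_openai("chat-prompt", {"topic": "AI"}))
```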

---

## Features

- Smart in-memory caching (via built-in dict or `cachetools`; see the sketch below)
- Background refresh loop (auto-syncs every 30 seconds)
- Variable validation with `{{ name }}` support
- Works with all LLM APIs (OpenAI, Claude, Gemini, etc.)
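
For illustration only, here is a minimal sketch of the TTL-caching pattern that `cachetools` enables; it is not the SDK's actual implementation, and the loader function is hypothetical:

```python
from cachetools import TTLCache, cached

# Keep up to 256 entries, each valid for 30 seconds (matching the SDK's
# documented 30-second refresh interval).
_prompt_cache = TTLCache(maxsize=256, ttl=30)


@cached(_prompt_cache)
def load_template(prompt_key: str) -> str:
    # Hypothetical loader: in the real SDK this is where the prompt template
    # would be fetched from the Promptliy API. Repeated calls within the TTL
    # are served from the cache instead.
    return f"Template for {prompt_key}"
```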

---

## Example Prompt Template

```text
Subject: Welcome, {{ name }}!

Hey {{ name }},

Thanks for joining {{ product }}. We're thrilled to have you on board!
```

---

## Error Handling

```python
# ❌ Missing required variable
promptliy_client.format("onboarding-email", { "name": "Leo" })
# ➜ ValueError: Missing required variables: product
```
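
In application code you will usually want to catch this rather than let it propagate; a minimal sketch:

```python
try:
    output = promptliy_client.format("onboarding-email", {"name": "Leo"})
except ValueError as exc:
    # e.g. "Missing required variables: product"
    print(f"Prompt could not be formatted: {exc}")
    output = None
```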

---

## License

This SDK is **commercial software** by Promptliy Inc.

By using this package, you agree to the terms in [`LICENSE.txt`](./LICENSE.txt).

- ✅ Free tier use is allowed
- 🚫 Production use requires a paid subscription

---

## Contact

- 🌐 [https://promptliy.ai](https://promptliy.ai)
- 📧 support@promptliy.ai

            
