# Backprompt: AI Integration, Simplified
Tailor LLM outputs to your exact needs, with no datasets or fine-tuning required.
## Installation
```bash
pip install backprompt
```
## Usage
#### Step 1: Solve for an optimal prompt
```python
from backprompt import BackpromptClient
# Initialize the client
client = BackpromptClient(
    api_key='[your-api-key]',
    api_url='[your-api-url]'
)
input_text = "The new smartphone boasts a 108MP camera and 5G capability."
desired_output = """
🌟 Tech Specs Breakdown 🌟
📸 Camera: 108MP (Ultra High-Res)
📡 Connectivity: 5G-enabled
💡 Key Benefit: Pro-level photography with lightning-fast uploads
"""
optimal_prompt = client.solve_prompt(input_text, desired_output)
print(f"Optimal prompt: {optimal_prompt}")
```
#### Step 2: Deploy the optimal prompt
```python
optimal_prompt.deploy(client)
```
#### Step 3: Generate using the deployed prompt
```python
prompt_vars = {"input_data": "The latest smartwatch features a 1.4-inch AMOLED display and 7-day battery life."}
completion = optimal_prompt.run(client, prompt_vars)
print(f"Generated response: {completion}")
```
## Key Features
1. **Prompt Optimization**: Solve for an optimal prompt from an input-output example pair.
2. **Prompt Deployment**: Deploy optimized prompts for quick reuse.
3. **Response Generation**: Generate responses for new inputs using deployed prompts.
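To make the variable-substitution step concrete, here is a minimal, purely illustrative sketch of how a solved prompt template containing a variable like `{input_data}` could be filled before generation. The template text is hypothetical and this is not backprompt's internal implementation:

```python
# Illustrative sketch (not backprompt internals): filling a solved prompt
# template that contains a {input_data} variable. The template wording is
# a made-up example.
template = (
    "Rewrite the following product description as an emoji-led spec "
    "breakdown:\n\n{input_data}"
)

prompt_vars = {
    "input_data": "The latest smartwatch features a 1.4-inch AMOLED display."
}

# Substitute the variables into the template, as a deployed prompt's `run`
# call presumably does before invoking the underlying model.
rendered = template.format(**prompt_vars)
print(rendered)
```

The same `prompt_vars` dictionary shape is what Step 3 above passes to `optimal_prompt.run`.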
## Backprompt's Edge for Developers
- Prompt Engineering, Automated
- Rapid Iteration Cycle
- Model-Agnostic Customization
- Resource Efficiency
## Backprompt Across Domains
- Nuanced Sentiment Extraction
- Specialized Knowledge Base Q&A
- Consistent Code Snippet Generation
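As one hedged sketch of a domain use, nuanced sentiment extraction fits the same three-step flow from the Usage section. The example pair below is hypothetical, not a backprompt-defined schema, and the client calls are shown as comments since they require credentials:

```python
# Hypothetical input-output pair for nuanced sentiment extraction. The
# label format is an example of a desired output, not a fixed schema.
input_text = "The battery life is great, but the screen scratches easily."
desired_output = """
Overall: Mixed
Positive: battery life (great)
Negative: screen durability (scratches easily)
"""

# With a configured client, the same three-step flow would apply:
#   optimal_prompt = client.solve_prompt(input_text, desired_output)
#   optimal_prompt.deploy(client)
#   completion = optimal_prompt.run(client, {"input_data": new_review})
print(desired_output.strip().splitlines()[0])
```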
For more information, visit [backprompt.ai](https://backprompt.ai).
© 2024 backprompt