# Python Prompt Manager
A lightweight, extensible prompt management system for LLM applications. Centralize and version your prompts while keeping your codebase clean.
[![PyPI version](https://badge.fury.io/py/python-prompt-manager.svg)](https://badge.fury.io/py/python-prompt-manager)
[![Python versions](https://img.shields.io/pypi/pyversions/python-prompt-manager.svg)](https://pypi.org/project/python-prompt-manager/)
[![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](https://opensource.org/licenses/MIT)
## Why Use This?
Managing prompts for LLM applications can quickly become messy. Hardcoded strings, scattered prompt files and URLs, and unclear versioning make maintenance difficult. This package solves those problems by providing a clean, centralized way to manage your prompts.
## Features
- **Multiple Storage Backends**: Store prompts in OpenAI's system, local files, or create your own storage extension
- **Environment-Based Configuration**: No hardcoded secrets or paths in your code
- **Flexible Caching**: Reduce API calls with configurable caching
- **Framework Agnostic**: Use with any Python framework or standalone scripts
- **Type Safe**: Full type hints for better development experience
- **Extensible**: Easy to add new storage backends
## Installation
```bash
# Basic installation
pip install python-prompt-manager

# With OpenAI support (quotes keep shells like zsh from globbing the brackets)
pip install "python-prompt-manager[openai]"

# With all optional dependencies
pip install "python-prompt-manager[all]"
```
## Quick Start
### Basic Usage
```python
from prompt_manager import get_prompt

# Get a prompt (reads from configured source)
prompt = get_prompt("welcome_message")
print(prompt)
```
### With Variables
```python
# Use template variables in your prompts
prompt = get_prompt(
    "greeting",
    variables={"name": "Alice", "day": "Monday"}
)
# "Hello Alice! Happy Monday!"
```
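For this to work, the stored prompt needs matching placeholders. A plausible `greeting.txt`, assuming `str.format`-style placeholders (the exact template syntax is an assumption; check the package docs):

```text
Hello {name}! Happy {day}!
```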
## Configuration
Configure your prompts using a simple Python dictionary:
```python
import os

PROMPT_CONFIG = {
    "prompts": {
        "welcome": {
            "source": "openai",
            "id": "pmpt_1234567890",
            "version": "1.0"
        },
        "greeting": {
            "source": "local",
            "path": "greeting.txt"
        },
        "analysis": {
            "source": "openai",
            "id": "pmpt_0987654321",
            "cache_ttl": 300  # 5 minutes
        }
    },
    "sources": {
        "openai": {
            "api_key": os.getenv("OPENAI_API_KEY"),  # Keep secrets in env vars
            "timeout": 30,
            "max_retries": 3
        },
        "local": {
            "base_dir": "./prompts"
        }
    }
}

# Initialize with your config
from prompt_manager import PromptManager

pm = PromptManager(PROMPT_CONFIG)
```
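With the `local` source above, each prompt's `path` is presumably resolved relative to `base_dir`, so this config implies a layout like:

```text
prompts/
└── greeting.txt
```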
### Django Configuration
In your Django settings:
```python
# settings.py
PROMPT_MANAGER = {
    "prompts": {
        "welcome": {"source": "openai", "id": "pmpt_123"},
        "email_template": {"source": "local", "path": "emails/welcome.txt"}
    },
    "sources": {
        "openai": {
            "api_key": env("OPENAI_API_KEY")  # e.g. django-environ's env()
        }
    }
}
```
## Usage Examples
### Simple Example
```python
from prompt_manager import PromptManager

# Initialize with config
pm = PromptManager({
    "prompts": {
        "welcome": {"source": "openai", "id": "pmpt_123"},
        "goodbye": {"source": "local", "path": "goodbye.txt"}
    }
})

# Get prompts
welcome = pm.get("welcome")
goodbye = pm.get("goodbye")
```
### With Default Fallback
```python
# Provide a default if prompt is not found
prompt = pm.get("optional_prompt", default="This is a fallback prompt")
```
### Dynamic Configuration
```python
# Load config from a file
import json

with open("prompts.json") as f:
    config = json.load(f)

pm = PromptManager(config)
```
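A minimal `prompts.json` mirroring the dictionary schema above might look like this (note that JSON cannot call `os.getenv`, so secrets for remote sources still belong in environment variables read at runtime):

```json
{
  "prompts": {
    "greeting": {"source": "local", "path": "greeting.txt"}
  },
  "sources": {
    "local": {"base_dir": "./prompts"}
  }
}
```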
### Template Variables
```python
# Configure a prompt with variables
config = {
    "prompts": {
        "greeting": {"source": "local", "path": "greeting.txt"}
    }
}

pm = PromptManager(config)

# Apply variables when retrieving
prompt = pm.get(
    "greeting",
    variables={"name": "Alice", "app_name": "AwesomeApp"}
)
# "Hello Alice! Welcome to AwesomeApp."
```
## Django Integration
Add your prompt configuration to settings:
```python
# settings.py
PROMPT_MANAGER = {
    "prompts": {
        "welcome_email": {"source": "openai", "id": "pmpt_email_123"},
        "user_greeting": {"source": "local", "path": "templates/greeting.txt"},
        "error_message": {"source": "openai", "id": "pmpt_error_456"}
    },
    "sources": {
        "openai": {"api_key": env("OPENAI_API_KEY")}  # e.g. django-environ
    }
}

# Optional: Add the Django app for additional features
INSTALLED_APPS = [
    ...
    'prompt_manager.integrations.django',  # Optional
]
```
Use in your views:
```python
# views.py
from django.conf import settings
from prompt_manager import PromptManager

# Initialize once at module level
pm = PromptManager(settings.PROMPT_MANAGER)

def my_view(request):
    prompt = pm.get("welcome_email", variables={"user": request.user.username})
    # Use prompt with your LLM
    ...
```
## Validation
By default, prompts are validated when first accessed. To validate all prompts on startup:
```python
config = {
    "prompts": {...},
    "validate_on_startup": True  # Validate all prompts exist
}

pm = PromptManager(config)
```
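A common pattern is to fail fast at process start. Assuming validation failures surface as exceptions from the constructor (the exact exception type isn't specified here), a sketch:

```python
import sys

from prompt_manager import PromptManager

try:
    pm = PromptManager({**config, "validate_on_startup": True})
except Exception as exc:
    # Abort startup rather than discover a missing prompt at request time
    sys.exit(f"Prompt validation failed: {exc}")
```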
## Advanced Usage
### Error Handling
```python
import logging

from prompt_manager import PromptManager, PromptNotFoundError

logger = logging.getLogger(__name__)
pm = PromptManager(config)

try:
    prompt = pm.get("my_prompt")
except PromptNotFoundError:
    # Handle missing prompt
    logger.error("Prompt not found")
except Exception as e:
    # Handle other errors
    logger.error(f"Error loading prompt: {e}")
```
### Caching
```python
# Configure cache TTL per prompt
config = {
    "prompts": {
        "static_prompt": {"source": "local", "path": "static.txt"},  # Uses default cache
        "dynamic_prompt": {"source": "openai", "id": "pmpt_123", "cache_ttl": 0}  # No cache
    },
    "cache_ttl": 3600  # Default 1 hour
}
pm = PromptManager(config)

# Clear cache manually
pm.clear_cache()
```
### Custom Sources
Extend the base class to add new sources:
```python
from prompt_manager.sources.base import BasePromptSource
class DatabaseSource(BasePromptSource):
    def fetch(self, config):
        # Your implementation: look up the prompt text using the
        # per-prompt config (e.g. a query keyed on config["id"])
        prompt_content = ...
        return prompt_content
```
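As a concrete illustration, a sqlite-backed source might look like the sketch below. The `prompts` table schema and the constructor are hypothetical, and how a custom source gets registered with `PromptManager` isn't shown here:

```python
import sqlite3

from prompt_manager.sources.base import BasePromptSource


class SQLiteSource(BasePromptSource):
    """Hypothetical source reading prompt text from a sqlite table."""

    def __init__(self, db_path):
        self.db_path = db_path

    def fetch(self, config):
        # Look the prompt up by the "id" key of its per-prompt config
        with sqlite3.connect(self.db_path) as conn:
            row = conn.execute(
                "SELECT content FROM prompts WHERE id = ?", (config["id"],)
            ).fetchone()
        if row is None:
            raise KeyError(f"No prompt with id {config['id']!r}")
        return row[0]
```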
## Best Practices
1. **Keep Secrets in Environment Variables**: API keys should never be in code
2. **Use Clear Naming**: Choose descriptive names for your prompts
3. **Set Appropriate Cache TTLs**: Static prompts can cache longer than dynamic ones
4. **Handle Errors Gracefully**: Always provide fallbacks for critical prompts
5. **Version Your Prompts**: Use the version field to track prompt iterations
## Development
```bash
# Clone the repository
git clone https://github.com/yourusername/python-prompt-manager.git
cd python-prompt-manager

# Install in development mode
pip install -e ".[dev]"

# Run tests
pytest

# Run linting
black src tests
flake8 src tests
mypy src
```
## Contributing
Contributions are welcome! Please feel free to submit a Pull Request.
1. Fork the repository
2. Create your feature branch (`git checkout -b feature/AmazingFeature`)
3. Commit your changes (`git commit -m 'Add some AmazingFeature'`)
4. Push to the branch (`git push origin feature/AmazingFeature`)
5. Open a Pull Request
## License
This project is licensed under the MIT License - see the [LICENSE](LICENSE) file for details.
## Acknowledgments
- Built for the modern LLM application stack
- Designed with production use in mind