# Hyrex Python SDK
Hyrex is a modern, open-source task orchestration framework built on PostgreSQL. It provides powerful features for distributed task processing, workflow management, and asynchronous job execution.
## Features
- **Task Orchestration**: Define and execute distributed tasks with automatic retries, timeouts, and error handling
- **Workflow Support**: Build complex DAG-based workflows with dependencies
- **Queue Management**: Route tasks to different queues with separate worker pools
- **Key-Value Store**: Built-in distributed key-value storage for task coordination
- **Task Context**: Rich execution context with task metadata and hierarchy tracking
- **Cron Scheduling**: Schedule recurring tasks with cron expressions
- **Hyrex Studio**: Web-based UI for monitoring and debugging (available at https://local.hyrex.studio)
- **Type Safety**: Full type hints and Pydantic model validation
## Installation
```bash
pip install hyrex
```
## Quick Start
### 1. Initialize a New Project
Use the interactive `hyrex init` command to set up a new project:
```bash
hyrex init
```
This will guide you through:
- Choosing a project name
- Selecting between PostgreSQL (self-hosted) or Hyrex Cloud
- Creating project files (`.env`, `hyrex_app.py`, `tasks.py`, `requirements.txt`, `Dockerfile`)
For manual database initialization (if needed):
```bash
export HYREX_DATABASE_URL="postgresql://user:password@localhost/dbname"
hyrex init-db
```
### 2. Project Structure
After running `hyrex init`, you'll have:
```
your-project/
├── .env              # Environment configuration
├── hyrex_app.py      # Hyrex app configuration
├── tasks.py          # Task definitions
├── requirements.txt  # Python dependencies
└── Dockerfile        # Container configuration
```
Example `tasks.py`:
```python
from hyrex import HyrexRegistry
from pydantic import BaseModel

hy = HyrexRegistry()

class EmailContext(BaseModel):
    to: str
    subject: str
    body: str

@hy.task
def send_email(context: EmailContext):
    print(f"Sending email to {context.to}")
    # Your email logic here
    return {"sent": True}
```
### 3. Send Tasks
Queue tasks for execution:
```python
# Send a task to the default queue
send_email.send(EmailContext(
    to="user@example.com",
    subject="Welcome!",
    body="Thanks for signing up"
))

# Send with custom configuration
send_email.with_config(
    queue="high-priority",
    max_retries=3,
    timeout_seconds=30
).send(EmailContext(...))
```
### 4. Run Workers
Start workers to process tasks:
```bash
hyrex run-worker hyrex_app:app
```
## Core Features
### Task Decorator
The `@task` decorator transforms functions into distributed tasks:
```python
@hy.task(
    queue="processing",           # Target queue (str or HyrexQueue object)
    max_retries=3,                # Maximum retry attempts (default: 0)
    timeout_seconds=300,          # Task timeout in seconds
    priority=5,                   # Priority 1-10 (higher = more important)
    on_error=error_handler,       # Error callback function
    retry_backoff=lambda n: n*10  # Backoff strategy function
)
def process_data(context: ProcessContext):
    # Task implementation
    pass
```
### Task Context
Access rich execution context within tasks:
```python
from hyrex import get_hyrex_context

@hy.task
def contextual_task():
    context = get_hyrex_context()

    print(f"Task ID: {context.task_id}")
    print(f"Task Name: {context.task_name}")
    print(f"Attempt: {context.attempt_number}/{context.max_retries}")
    print(f"Queue: {context.queue}")
    print(f"Parent Task: {context.parent_id}")
    print(f"Root Task: {context.root_id}")
```
### Key-Value Store
Use HyrexKV for distributed state management:
```python
from hyrex import HyrexKV

@hy.task
def process_with_state(user_id: str):
    # Store state
    HyrexKV.set(f"user:{user_id}:status", "processing")

    # Retrieve state
    status = HyrexKV.get(f"user:{user_id}:status")

    # Delete state
    HyrexKV.delete(f"user:{user_id}:status")
```
**Note**: HyrexKV stores string values up to 1MB in size.
### Workflows
Build complex DAG-based workflows:
```python
@hy.task
def extract_data():
    return {"data": "extracted"}

@hy.task
def transform_data():
    return {"data": "transformed"}

@hy.task
def validate_data():
    return {"data": "validated"}

@hy.task
def load_data():
    return {"data": "loaded"}

class ETLWorkflowArgs(BaseModel):
    source: str
    destination: str

@hy.workflow(
    queue="etl",
    timeout_seconds=3600,
    workflow_arg_schema=ETLWorkflowArgs
)
def etl_workflow():
    # Define workflow DAG
    extract_data >> transform_data >> load_data

    # Parallel execution
    extract_data >> [transform_data, validate_data] >> load_data

    # With custom config
    extract_data >> transform_data.with_config(queue="cpu-intensive") >> load_data
```
Send workflows:
```python
etl_workflow.send(ETLWorkflowArgs(
    source="s3://input",
    destination="s3://output"
))
```
Access workflow context:
```python
from hyrex import get_hyrex_workflow_context

@hy.task
def workflow_task():
    wf_context = get_hyrex_workflow_context()

    # Access workflow arguments
    args = wf_context.workflow_args

    # Access other task results
    extract_result = wf_context.durable_runs.get("extract_data")
    if extract_result:
        extract_result.refresh()  # Get latest status
```
### Dynamic Task Configuration
Use `with_config()` to modify task behavior at runtime:
```python
# Define base task
@hy.task(queue="default", max_retries=1)
def flexible_task(data: str):
    return process(data)

# Override configuration when sending
flexible_task.with_config(
    queue="high-priority",
    max_retries=5,
    timeout_seconds=60,
    priority=10
).send("important-data")
```
### Cron Scheduling
Schedule recurring tasks:
```python
@hy.task(cron="0 2 * * *")  # Daily at 2 AM
def daily_cleanup():
    # Cleanup logic
    pass

# Tasks with default arguments can also be scheduled
@hy.task(cron="0 0 * * 0")  # Weekly on Sunday
def weekly_backup(retention_days: int = 30):
    # Backup logic with configurable retention
    pass

@hy.workflow(cron="0 0 * * 0")  # Weekly on Sunday
def weekly_report():
    generate_report >> send_report
```
**Note**: Cron-scheduled tasks must have no arguments or all arguments must have default values.
### Error Handling
Implement custom error handlers:
```python
def handle_task_error(error: Exception):
    # Log error, send alerts, etc.
    print(f"Task failed: {error}")

@hy.task(
    on_error=handle_task_error,
    max_retries=3,
    retry_backoff=lambda attempt: 2 ** attempt  # Exponential backoff
)
def risky_task():
    # Task that might fail
    pass
```
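The `retry_backoff` callable above maps the attempt number to a delay. Unbounded exponential growth and synchronized retries are both risky in production, so a capped backoff with jitter is a common refinement. A pure-Python sketch (the signature, attempt number in and delay seconds out, is assumed from the decorator comments above):

```python
import random

def capped_jitter_backoff(base: float = 2.0, cap: float = 300.0):
    """Build a retry_backoff-style callable: attempt -> delay in seconds."""
    def backoff(attempt: int) -> float:
        # Exponential growth, capped at `cap`, with full jitter
        # so many failing tasks don't all retry at the same instant.
        delay = min(cap, base ** attempt)
        return random.uniform(0, delay)
    return backoff

backoff = capped_jitter_backoff()
# Delay never exceeds the cap, even for large attempt numbers
print(backoff(1) <= 2.0, backoff(50) <= 300.0)  # True True
```

Pass the result as `retry_backoff=capped_jitter_backoff()` in place of the bare lambda.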
## CLI Commands
- `hyrex init` - Interactive project initialization wizard
- `hyrex init-db` - Initialize the database schema
- `hyrex run-worker <module:instance>` - Start a worker process
- `hyrex studio` - Start Hyrex Studio server
## Monitoring with Hyrex Studio
Hyrex Studio provides a web interface for monitoring your tasks and workflows:
1. Start the studio server:
```bash
hyrex studio
```
2. Open https://local.hyrex.studio in your browser
3. Monitor task execution, view logs, and inspect your data
## Configuration
Hyrex uses environment variables for configuration:
- `HYREX_DATABASE_URL` - PostgreSQL connection string (required)
- `STUDIO_PORT` - Port for Hyrex Studio (default: 1337)
- `STUDIO_VERBOSE` - Enable verbose logging for Studio (default: false)
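For local development these typically live in the `.env` file that `hyrex init` generates. A sketch (all values are placeholders for your own setup):

```shell
# .env — environment configuration read at startup
HYREX_DATABASE_URL=postgresql://user:password@localhost:5432/hyrex
STUDIO_PORT=1337
STUDIO_VERBOSE=false
```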
## Advanced Usage
### Registry Inheritance
Share task definitions across modules:
```python
# common_tasks.py
from hyrex import HyrexRegistry

common_registry = HyrexRegistry()

@common_registry.task
def shared_task():
    pass

# main.py
from hyrex import HyrexRegistry
from common_tasks import common_registry

hy = HyrexRegistry()
hy.add_registry(common_registry)  # Include all tasks from common_registry
```
### Task Composition
Build complex task hierarchies:
```python
@hy.task
def child_task(index: int):
    # Child work; hierarchy is tracked automatically
    pass

@hy.task
def parent_task(count: int):
    # Spawn child tasks
    for i in range(count):
        child_task.send(index=i)

    # Tasks maintain parent-child relationships,
    # visible in context.parent_id and context.root_id
```
### Idempotency
Ensure tasks run only once:
```python
@hy.task
def idempotent_task(user_id: str):
    # Process user
    pass

# Using an idempotency key so the task runs only once per key
user_id = "123"
idempotent_task.with_config(
    idempotency_key=f"process-user-{user_id}"
).send(user_id=user_id)
```
## Requirements
- Python 3.11+
- PostgreSQL 12+
- All other Python dependencies are installed automatically by `pip`
## License
Apache License 2.0
## Links
- [GitHub Repository](https://github.com/hyrex-labs/hyrex-python)
- [Documentation](https://github.com/hyrex-labs/hyrex-python)
- [Hyrex Studio](https://local.hyrex.studio)