<div align="center">
<h1>FlowerPower</h1>
<h3>Simple Workflow Framework - Hamilton + APScheduler = FlowerPower</h3>
<img src="./image.png" alt="FlowerPower Logo" width="600" height="400">
</div>
---
## 📚 Table of Contents
1. [Overview](#overview)
2. [Installation](#installation)
3. [Getting Started](#getting-started)
- [Initialize Project](#initialize-project)
- [Add Pipeline](#add-pipeline)
- [Setup Pipeline](#setup-pipeline)
- [Run Pipeline](#run-pipeline)
- [Schedule Pipeline](#schedule-pipeline)
- [Start Worker](#start-worker)
- [Track Pipeline](#track-pipeline)
4. [Development](#development)
- [Dev Services](#dev-services)
---
## 🔍 Overview
FlowerPower is a simple workflow framework based on two fantastic Python libraries:
- **[Hamilton](https://github.com/DAGWorks-Inc/hamilton)**: Creates DAGs from your pipeline functions
- **[APScheduler](https://github.com/agronholm/apscheduler)**: Handles pipeline scheduling
### Key Features
- 🔄 **Pipeline Workflows**: Create and execute complex DAG-based workflows
- ⏰ **Scheduling**: Run pipelines at specific times or intervals
- ⚙️ **Parameterization**: Easily configure pipeline parameters
- 📊 **Tracking**: Monitor executions with Hamilton UI
- 🛠️ **Flexible Configuration**: Simple YAML-based setup
- 📡 **Distributed Execution**: Support for distributed environments
[More details in Hamilton docs](https://hamilton.dagworks.io/en/latest/)
---
## 📦 Installation
```bash
# Basic installation
pip install flowerpower
# With scheduling support
pip install "flowerpower[scheduler]"
# Additional components
pip install "flowerpower[mqtt]" # MQTT broker
pip install "flowerpower[redis]" # Redis broker
pip install "flowerpower[mongodb]" # MongoDB store
pip install "flowerpower[ray]" # Ray computing
pip install "flowerpower[dask]" # Dask computing
pip install "flowerpower[ui]" # Hamilton UI
pip install "flowerpower[webserver]" # Web server
```
---
## 🚀 Getting Started
### Initialize Project
**Option 1: Command Line**
```bash
flowerpower init new-project
cd new-project
```
**Option 2: Python**
```python
from flowerpower import init
init("new-project")
```
This creates the basic project config file:
- `conf/project.yml`
### 📦 Optional: Project Management with UV (Recommended)
We recommend using the project manager `uv` to manage your project's dependencies.
**Installation**
```bash
pip install uv
```
> For more installation options, visit: https://docs.astral.sh/uv/getting-started/installation/
**Project Initialization**
```bash
uv init --app --no-readme --vcs git
```
---
### Pipeline Management
#### Creating a New Pipeline
**Option 1: Command Line**
```bash
flowerpower new my_flow
```
**Option 2: Python**
```python
# Using PipelineManager
from flowerpower.pipeline import PipelineManager
pm = PipelineManager()
pm.new("my_flow")
# Or using the new function directly
from flowerpower.pipeline import new
new("my_flow")
```
This creates the pipeline module and its configuration file:
- `pipelines/my_flow.py`
- `conf/pipelines/my_flow.yml`
#### Setting Up a Pipeline
1. **Add Pipeline Functions**
Build your pipeline by adding the functions (nodes) that make up the DAG to `pipelines/my_flow.py`, following the Hamilton paradigm. A minimal example module is sketched below, after step 2.
2. **Parameterize Functions**
You can parameterize functions in two ways:
**Method 1: Default Values**
```python
import pandas as pd


def add_int_col(
    df: pd.DataFrame,
    col_name: str = "foo",
    values: str = "bar",
) -> pd.DataFrame:
    # Add a constant-valued column to the incoming DataFrame.
    return df.assign(**{col_name: values})
```
**Method 2: Configuration File**
In `conf/pipelines/my_flow.yml`:
```yaml
...
func:
  add_int_col:
    col_name: foo
    values: bar
...
```
Add the `@parameterize` decorator to the function in your pipeline file:
```python
from hamilton.function_modifiers import parameterize

# PARAMS holds the parameter values loaded from conf/pipelines/my_flow.yml
@parameterize(**PARAMS.add_int_col)
def add_int_col(
    df: pd.DataFrame,
    col_name: str,
    values: str,
) -> pd.DataFrame:
    return df.assign(**{col_name: values})
```
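For reference, here is a minimal sketch of what a complete `pipelines/my_flow.py` might look like. Hamilton wires functions into a DAG by matching each function's parameter names against other function names. The `raw_df` and `final_df` node names here are hypothetical, chosen to match the `inputs` (`data_path`, `fs_protocol`) and `final_vars` used in the configuration examples that follow.

```python
# pipelines/my_flow.py -- an illustrative sketch, not a template generated by FlowerPower.
import pandas as pd


def raw_df(data_path: str, fs_protocol: str) -> pd.DataFrame:
    # `data_path` and `fs_protocol` arrive via the pipeline's `inputs`.
    if fs_protocol != "local":
        raise ValueError(f"unsupported fs_protocol: {fs_protocol}")
    return pd.read_csv(data_path)


def add_int_col(
    raw_df: pd.DataFrame, col_name: str = "foo", values: str = "bar"
) -> pd.DataFrame:
    # Depends on `raw_df` above because the parameter name matches that node.
    return raw_df.assign(**{col_name: values})


def final_df(add_int_col: pd.DataFrame) -> pd.DataFrame:
    # Terminal node, typically requested via `final_vars`.
    return add_int_col.drop_duplicates()
```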
---
### Running Pipelines
#### Configuration
You can configure the run parameters `inputs` and `final_vars`, along with other options, in the pipeline
configuration file `conf/pipelines/my_flow.yml` or directly in the pipeline execution function.
#### Using the Pipeline Configuration
```yaml
...
run:
  inputs:
    data_path: path/to/data.csv
    fs_protocol: local
  final_vars: [add_int_col, final_df]
  # optional parameters
  with_tracker: false
  executor: threadpool # or processpool, ray, dask
...
```
#### Execution Methods
There are three ways to execute a pipeline:
1. **Direct Execution**
- Runs in current process
- No data store required
2. **Job Execution**
- Runs as APScheduler job
- Returns job results
- Requires data store and event broker
3. **Async Job Addition**
- Adds to APScheduler
- Returns job ID
- Results retrievable from data store
#### Command Line Usage
```bash
# Note: add --inputs, --final-vars, and other optional parameters if they are not set in the config file

# Direct execution
flowerpower run my_flow

# Job execution
flowerpower run-job my_flow

# Add as scheduled job
flowerpower add-job my_flow
```
You can also use the `--inputs` and `--final-vars` flags to override parameters from the configuration file, or to supply them if they are not set there.
```bash
flowerpower run my_flow \
    --inputs data_path=path/to/data.csv,fs_protocol=local \
    --final-vars final_df \
    --executor threadpool \
    --without-tracker
```
#### Python Usage
```python
from flowerpower.pipeline import Pipeline, run, run_job, add_job
# Using Pipeline class
p = Pipeline("my_flow")
# Note: add inputs, final_vars, and other optional arguments if not specified in the config file
result = p.run()
result = p.run_job()
job_id = p.add_job()
# Using functions
result = run("my_flow")
result = run_job("my_flow")
job_id = add_job("my_flow")
```
You can also use the `inputs` and `final_vars` arguments to override parameters from the configuration file, or to supply them if they are not set there.
```python
result = run(
    "my_flow",
    inputs={
        "data_path": "path/to/data.csv",
        "fs_protocol": "local"
    },
    final_vars=["final_df"],
    executor="threadpool",
    with_tracker=False
)
```
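For context, direct execution drives the Hamilton DAG built from your pipeline module. A roughly equivalent run using Hamilton's public driver API directly might look like the sketch below; FlowerPower adds config loading, executors, scheduling, and tracking on top of this.

```python
from hamilton import driver

from pipelines import my_flow  # the module created by `flowerpower new`

# Build a driver from the pipeline module and execute the requested outputs.
dr = driver.Builder().with_modules(my_flow).build()
result = dr.execute(
    ["final_df"],
    inputs={"data_path": "path/to/data.csv", "fs_protocol": "local"},
)
```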
---
## ⏰ Scheduling Pipelines
### Setting Up Schedules
#### Command Line Options
```bash
# Run every 30 seconds
flowerpower schedule my_flow \
    --type interval \
    --interval-params seconds=30

# Run at specific date/time
flowerpower schedule my_flow \
    --type date \
    --date-params year=2022,month=1,day=1,hour=0,minute=0,second=0

# Run with cron parameters
flowerpower schedule my_flow \
    --type cron \
    --cron-params second=0,minute=0,hour=0,day=1,month=1,day_of_week=0

# Run with crontab expression
flowerpower schedule my_flow \
    --type cron \
    --crontab "0 0 1 1 0"
```
#### Python Usage
```python
from flowerpower.scheduler import schedule, Pipeline
# Using Pipeline class
p = Pipeline("my_flow")
p.schedule("interval", seconds=30)
# Using schedule function
schedule("my_flow", "interval", seconds=30)
```
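The CLI's other trigger types should map onto the same Python call. Assuming the keyword arguments mirror the CLI parameters shown above (an unverified sketch, not a documented signature):

```python
# Run once at a specific date/time (mirrors --type date --date-params ...)
schedule("my_flow", "date", year=2022, month=1, day=1, hour=0, minute=0, second=0)

# Run on a cron trigger (mirrors --type cron --crontab "0 0 1 1 0")
schedule("my_flow", "cron", crontab="0 0 1 1 0")
```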
---
## 👷 Worker Management
### Starting a Worker
**Command Line**
```bash
flowerpower start-worker
```
**Python**
```python
# Using the SchedulerManager class
from flowerpower.scheduler import SchedulerManager
sm = SchedulerManager()
sm.start_worker()
# Using the start_worker function
from flowerpower.scheduler import start_worker
start_worker()
```
### Worker Configuration
Configure your worker in `conf/project.yml`:
```yaml
# PostgreSQL Configuration
data_store:
  type: postgres
  uri: postgresql+asyncpg://user:password@localhost:5432/flowerpower

# Redis Event Broker
event_broker:
  type: redis
  uri: redis://localhost:6379
  # Alternative configuration:
  # host: localhost
  # port: 6379
```
#### Alternative Data Store Options
**SQLite**
```yaml
data_store:
  type: sqlite
  uri: sqlite+aiosqlite:///flowerpower.db
```
**MySQL**
```yaml
data_store:
  type: mysql
  uri: mysql+aiomysql://user:password@localhost:3306/flowerpower
```
**MongoDB**
```yaml
data_store:
  type: mongodb
  uri: mongodb://localhost:27017/flowerpower
```
**In-Memory**
```yaml
data_store:
  type: memory
```
#### Alternative Event Broker Options
**MQTT**
```yaml
event_broker:
  type: mqtt
  host: localhost
  port: 1883
  username: user # optional, only if the broker requires auth
  password: supersecret # optional, only if the broker requires auth
```
**Redis**
```yaml
event_broker:
  type: redis
  uri: redis://localhost:6379
  # Alternative configuration:
  # host: localhost
  # port: 6379
```
**In-Memory**
```yaml
event_broker:
  type: memory
```
---
## 📊 Pipeline Tracking
### Hamilton UI Setup
#### Local Installation
```bash
# Install UI package
pip install "flowerpower[ui]"
# Start UI server
flowerpower hamilton-ui
```
> Access the UI at: http://localhost:8241
#### Docker Installation
```bash
# Clone Hamilton repository
git clone https://github.com/dagworks-inc/hamilton
cd hamilton/ui
# Start UI server
./run.sh
```
> Access the UI at: http://localhost:8242
### Tracker Configuration
Configure tracking in `conf/project.yml`:
```yaml
username: my_email@example.com
api_url: http://localhost:8241
ui_url: http://localhost:8242
api_key: optional_key
```
Then specify the `tracker` parameters in the pipeline configuration `conf/pipelines/my_flow.yml`:
```yaml
...
tracker:
  project_id: 1
  tags:
    environment: dev
    version: 1.0
  dag_name: my_flow_123
...
```
---
## 🛠️ Development Services
### Local Development Setup
Download the docker-compose configuration:
```bash
curl -O https://raw.githubusercontent.com/legout/flowerpower/main/docker/docker-compose.yml
```
### Starting Services
```bash
# MQTT Broker (EMQX)
docker-compose up -d mqtt

# Redis
docker-compose up -d redis

# MongoDB
docker-compose up -d mongodb

# PostgreSQL
docker-compose up -d postgres
```
---
## 📝 License
[MIT License](LICENSE)
---
## 🤝 Contributing
Contributions are welcome! Please feel free to submit a Pull Request.
---
## 📫 Support
For support, please open an issue in the GitHub repository.