pymfpatch


Name: pymfpatch
Version: 1.0.0
Home page: https://gitlab.com/arep-dev/pymfpatch.git
Summary: Fill missing weather measurements using user-provided ERA5 data and a hybrid GRU + XGBoost approach
Upload time: 2025-08-04 08:45:40
Maintainer: None
Docs URL: None
Author: Alexis SAUVAGEON
Requires Python: >=3.7
License: None
Keywords: weather, Météo-France, ERA5, encoder-decoder, machine learning
Requirements: No requirements were recorded.
# pymfpatch

A modular, end-to-end pipeline for gap-filling EnergyPlus EPW weather files using high-resolution ERA reanalysis (or other reference series) and a hybrid GRU + XGBoost approach.  
It automatically parses, cleans, and feature-engineers the input series, imputes missing meteorological variables, and writes out a fully compliant EPW file ready for building simulations.

------------------------------------------------------------------------

## Features

-   **Dual-stage imputation**
    -   **GRU** for sequential, multi-output regression on variables with temporal patterns (temperature, humidity, wind, radiation, ...)
    -   **XGBoost** for the remaining scalar variables (visibility, ceiling height, albedo, precipitation, ...)
-   **Flexible configuration** of
    -   which variables to impute with GRU vs. XGBoost
    -   model hyperparameters (learning rate, depth, batch size, early stopping, ...); see the sketch after this list
-   **EnergyPlus EPW input/output** with native header preservation
-   **Utility functions** for parsing, cleaning, feature engineering, and final EPW formatting
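
The configuration surface beyond the `WeatherImputer` constructor is not documented in this README, so the snippet below is only a hypothetical sketch of how the GRU/XGBoost split and the hyperparameters might be expressed; the keyword names are illustrative, not the package's actual API.

``` python
from pymfpatch import WeatherImputer

# Hypothetical sketch: the keyword names below are illustrative only,
# not the documented pymfpatch API.
imputer = WeatherImputer(
    path_to_ref='Data/ERA/marignane-era.epw',
    path_to_stn='Data/MF/marignane-mf.epw',
    # variables with strong temporal structure -> GRU (illustrative names)
    gru_variables=['drybulb', 'relhum', 'windspd', 'glohorrad'],
    # remaining scalar variables -> XGBoost (illustrative names)
    xgb_variables=['visibility', 'ceiling_hgt', 'albedo', 'precip_wtr'],
    # hypothetical hyperparameter dictionaries
    gru_params={'hidden_size': 64, 'batch_size': 128, 'lr': 1e-3, 'early_stopping': 10},
    xgb_params={'max_depth': 6, 'learning_rate': 0.1, 'n_estimators': 500},
)
```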

------------------------------------------------------------------------

## Installation

### 1. Editable install from local source
``` bash
# clone & install in editable mode
git clone https://your-repo.org/your-org/pymfpatch.git
cd pymfpatch
pip install -e .
```

### 2. Install from PyPI
``` bash
pip install pymfpatch
```

### 3. GPU support
For fastest training and inference, run on a CUDA-enabled GPU.
By default, `pip install pymfpatch` pulls in the CPU-only `torch` wheel.
If you have a compatible GPU (and, for example, Python 3.12), you can upgrade to the GPU build with:
``` bash
pip install torch==2.7.1+cu128 --index-url https://download.pytorch.org/whl/cu128
```
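
To confirm the GPU build is actually in use, PyTorch's standard CUDA queries can be checked from Python:

``` python
import torch

# True only if a CUDA-capable device is visible to this torch build
print(torch.cuda.is_available())

# Name of the first GPU, when one is present
if torch.cuda.is_available():
    print(torch.cuda.get_device_name(0))
```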

## Quickstart

``` python
from pymfpatch import WeatherImputer
# 1) Instantiate the imputer with your reference (ERA) and target station EPWs:
imputer = WeatherImputer(
    path_to_ref='Data/ERA/marignane-era.epw',
    path_to_stn='Data/MF/marignane-mf.epw',
)

# 2) Run the imputation pipeline:
imputer.process()

# 3) Write out a fully-imputed EPW:
imputer.write("marignane-imputed.epw")
```
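
The same three calls can be looped over several stations. The directory layout below (one reference EPW and one station EPW per site, matched by file stem) is assumed for illustration and mirrors the paths used above:

``` python
from pathlib import Path

from pymfpatch import WeatherImputer

# Assumed layout: Data/ERA/<site>-era.epw and Data/MF/<site>-mf.epw for each site
for stn_path in Path('Data/MF').glob('*-mf.epw'):
    site = stn_path.stem.replace('-mf', '')
    ref_path = Path('Data/ERA') / f'{site}-era.epw'

    imputer = WeatherImputer(path_to_ref=str(ref_path), path_to_stn=str(stn_path))
    imputer.process()
    imputer.write(f'{site}-imputed.epw')
```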

            

Raw data

            {
    "_id": null,
    "home_page": "https://gitlab.com/arep-dev/pymfpatch.git",
    "name": "pymfpatch",
    "maintainer": null,
    "docs_url": null,
    "requires_python": ">=3.7",
    "maintainer_email": null,
    "keywords": "weather, M\u00e9t\u00e9o-France, ERA5, encoder-decoder, machine learning",
    "author": "Alexis SAUVAGEON",
    "author_email": "Your Name <your.email@example.com>",
    "download_url": "https://files.pythonhosted.org/packages/ce/65/36f909759e7f76c979e4c1404ff59cb263ac0bc47df49adbb8260c5955fd/pymfpatch-1.0.0.tar.gz",
    "platform": null,
    "description": "# pymfpatch\n\nA modular, end-to-end pipeline for gap-filling EnergyPlus EPW weather files using high-resolution ERA reanalysis (or other reference series) and a hybrid GRU + XGBoost approach.  \nIt automatically parses, cleans, feature-engineers, imputes missing meteorological variables, and writes out a fully compliant EPW-ready for building simulations.\n\n------------------------------------------------------------------------\n\n## Features\n\n-   **Dual-stage imputation**\n    -   **GRU** for sequential, multi-output regression on variables with temporal patterns (temperature, humidity, wind, radiation...)\n    -   **XGBoost** for the remaining scalar variables (visibility, ceiling height, albedo, precipitation...)\\\n-   **Flexible configuration** of\n    -   which variables to impute with GRU vs XGB\\\n    -   model hyperparameters (learning rate, depth, batch size, early stopping, ...)\\\n-   **EnergyPlus EPW input/output** with native header preservation\\\n-   **Utility functions** for parsing, cleaning, feature-engineering and final EPW formatting\n\n------------------------------------------------------------------------\n\n## Installation\n\n### 1. Editable install from local source\n``` bash\n# clone & install in editable mode\ngit clone https://your-repo.org/your-org/pymfpatch.git\ncd pymfpatch\npip install -e .\n```\n\n### 2. Install from PyPI\n``` bash\npip install pymfpatch\n```\n\n### 3. GPU enabling\nFor fastest training/inference, run on a CUDA-enabled GPU.\nBy default pip install pymfpatch will pull in the CPU-only torch wheel.\nIf you have Python 3.12 and a compatible GPU, for example, you can upgrade to the GPU build with:\n``` bash\npip install torch==2.7.1+cu128 --index-url https://download.pytorch.org/whl/cu128\n```\n\n## Quickstart\n\n``` python\nfrom pymfpatch import WeatherImputer\n# 1) Instantiate the imputer with your reference (ERA) and target station EPWs:\nimputer = WeatherImputer(\n    path_to_ref = 'Data/ERA/marignane-era.epw',\n    path_to_stn   = 'Data/MF/marignane-mf.epw',\n)\n\n# 2) Run the imputation pipeline:\nimputer.process()\n\n# 3) Write out a fully-imputed EPW:\nimputer.write(\"marignane-imputed.epw\")\n```\n",
    "bugtrack_url": null,
    "license": null,
    "summary": "Fill missing weather measurements using user-provided ERA5 data and a hybrid GRU + XGBoost approach",
    "version": "1.0.0",
    "project_urls": {
        "Homepage": "https://gitlab.com/arep-dev/pymfpatch.git"
    },
    "split_keywords": [
        "weather",
        " m\u00e9t\u00e9o-france",
        " era5",
        " encoder-decoder",
        " machine learning"
    ],
    "urls": [
        {
            "comment_text": null,
            "digests": {
                "blake2b_256": "9e6ff1699b2dfed5e90967329352733df9aa494e930d868d22451898feeb4237",
                "md5": "01885fda8e257e672930277c6f347d80",
                "sha256": "efa73ff0ee367e1fd2971b6ab614e9d4a1259fdb621be574cb3a5749dbcda22f"
            },
            "downloads": -1,
            "filename": "pymfpatch-1.0.0-py3-none-any.whl",
            "has_sig": false,
            "md5_digest": "01885fda8e257e672930277c6f347d80",
            "packagetype": "bdist_wheel",
            "python_version": "py3",
            "requires_python": ">=3.7",
            "size": 16025,
            "upload_time": "2025-08-04T08:45:38",
            "upload_time_iso_8601": "2025-08-04T08:45:38.787080Z",
            "url": "https://files.pythonhosted.org/packages/9e/6f/f1699b2dfed5e90967329352733df9aa494e930d868d22451898feeb4237/pymfpatch-1.0.0-py3-none-any.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": null,
            "digests": {
                "blake2b_256": "ce6536f909759e7f76c979e4c1404ff59cb263ac0bc47df49adbb8260c5955fd",
                "md5": "6869f41f006c34bde695a24a08eaf118",
                "sha256": "edaf4fd1afe55b8bec97966eb315a05002959b414d070bfc012788198364c6df"
            },
            "downloads": -1,
            "filename": "pymfpatch-1.0.0.tar.gz",
            "has_sig": false,
            "md5_digest": "6869f41f006c34bde695a24a08eaf118",
            "packagetype": "sdist",
            "python_version": "source",
            "requires_python": ">=3.7",
            "size": 15601,
            "upload_time": "2025-08-04T08:45:40",
            "upload_time_iso_8601": "2025-08-04T08:45:40.022676Z",
            "url": "https://files.pythonhosted.org/packages/ce/65/36f909759e7f76c979e4c1404ff59cb263ac0bc47df49adbb8260c5955fd/pymfpatch-1.0.0.tar.gz",
            "yanked": false,
            "yanked_reason": null
        }
    ],
    "upload_time": "2025-08-04 08:45:40",
    "github": false,
    "gitlab": true,
    "bitbucket": false,
    "codeberg": false,
    "gitlab_user": "arep-dev",
    "gitlab_project": "pymfpatch",
    "lcname": "pymfpatch"
}
        