boat-jit

name: boat-jit
version: 1.0.3
home_page: https://github.com/callous-youth/BOAT/tree/boat_jit
summary: A Bilevel Optimization Toolkit in Python for Learning and Vision Tasks Based on Jittor
upload_time: 2025-10-20 11:54:42
maintainer: None
docs_url: None
author: Yaohua Liu, Xianghao Jiao, Risheng Liu
requires_python: >=3.8.0
license: MIT
keywords: bilevel-optimization, learning and vision, python, deep learning, jittor
requirements: numpy, setuptools, torch, higher, matplotlib

# BOAT --- Task-Agnostic Operation Toolbox for Gradient-based Bilevel Optimization
[![PyPI version](https://badge.fury.io/py/boat-jit.svg)](https://badge.fury.io/py/boat-jit)
![GitHub Actions Workflow Status](https://img.shields.io/github/actions/workflow/status/callous-youth/BOAT/workflow.yml)
[![codecov](https://codecov.io/github/callous-youth/BOAT/graph/badge.svg?token=0MKAOQ9KL3)](https://codecov.io/github/callous-youth/BOAT)
![GitHub commit activity](https://img.shields.io/github/commit-activity/w/callous-youth/BOAT)
[![pages-build-deployment](https://github.com/callous-youth/BOAT/actions/workflows/pages/pages-build-deployment/badge.svg)](https://github.com/callous-youth/BOAT/actions/workflows/pages/pages-build-deployment)
![GitHub top language](https://img.shields.io/github/languages/top/callous-youth/BOAT)
![GitHub language count](https://img.shields.io/github/languages/count/callous-youth/BOAT)
![Python version](https://img.shields.io/pypi/pyversions/boat-jit)
![license](https://img.shields.io/badge/license-MIT-000000.svg)
![Code style: black](https://img.shields.io/badge/code%20style-black-000000.svg)

**BOAT** is a task-agnostic, gradient-based **Bi-Level Optimization (BLO)** Python library that abstracts the key BLO processes into modular, flexible components. It enables researchers and developers to tackle learning tasks with a hierarchically nested structure by providing customizable and diverse operator decomposition, encapsulation, and combination. BOAT supports specialized optimization strategies, including second-order or first-order, nested or non-nested, and with or without theoretical guarantees, catering to various levels of complexity.

To enhance flexibility and efficiency, BOAT incorporates the **Dynamic Operation Library (D-OL)** and the **Hyper Operation Library (H-OL)**, alongside a collection of state-of-the-art first-order optimization strategies. BOAT also provides multiple implementation versions:
- **[PyTorch-based](https://github.com/callous-youth/BOAT)**: An efficient and widely-used version.
- **[Jittor-based](https://github.com/callous-youth/BOAT/tree/boat_jit)**: An accelerated version for high-performance tasks.
- **[MindSpore-based](https://github.com/callous-youth/BOAT/tree/boat_ms)**: Incorporating the latest first-order optimization strategies to support emerging application scenarios.


BOAT is designed to offer robust computational support for a broad spectrum of BLO research and applications, enabling innovation and efficiency in machine learning and computer vision.


## 🔑  **Key Features**
- **Dynamic Operation Library (D-OL)**: Incorporates 4 advanced dynamic system construction operations, enabling users to flexibly tailor optimization trajectories for BLO tasks.
- **Hyper-Gradient Operation Library (H-OL)**: Provides 9 refined operations for hyper-gradient computation, significantly enhancing the precision and efficiency of gradient-based BLO methods.
- **First-Order Gradient Methods (FOGMs)**: Integrates 4 state-of-the-art first-order methods, enabling fast prototyping and validation of new BLO algorithms. With its modularized design, BOAT allows flexible combinations of multiple upper-level and lower-level operators, resulting in **63 + 16** (nearly 80) algorithmic combinations, offering unparalleled adaptability.
- **Modularized Design for Customization**: Empowers users to flexibly combine dynamic and hyper-gradient operations while customizing the specific forms of problems, parameters, and optimizer choices, enabling seamless integration into diverse task-specific codes.
- **Comprehensive Testing & Continuous Integration**: Achieves **99% code coverage** through rigorous testing with **pytest** and **Codecov**, coupled with continuous integration via **GitHub Actions**, ensuring software robustness and reliability.
- **Fast Prototyping & Algorithm Validation**: Streamlined support for defining, testing, and benchmarking new BLO algorithms.
- **Unified Computational Analysis**: Offers a comprehensive complexity analysis of gradient-based BLO techniques to guide users in selecting optimal configurations for efficiency and accuracy.
- **Detailed Documentation & Community Support**: Offers thorough documentation with practical examples and API references via **MkDocs**, ensuring accessibility and ease of use for both novice and advanced users.

##  🚀 **Why BOAT?**
Existing automatic differentiation (AD) tools primarily focus on specific, fundamental optimization strategies, such as explicit or implicit methods, and are often targeted at meta-learning or particular application scenarios, offering little support for algorithm customization.

In contrast, **BOAT** expands the landscape of Bi-Level Optimization (BLO) applications by supporting a broader range of problem-adaptive operations. It bridges the gap between theoretical research and practical deployment, offering unparalleled flexibility to design, customize, and accelerate BLO techniques.


##  🏭 **Applications**
BOAT enables efficient implementation and adaptation of advanced BLO techniques for key applications, including but not limited to:
- **Hyperparameter Optimization (HO)**
- **Neural Architecture Search (NAS)**
- **Adversarial Training (AT)**
- **Few-Shot Learning (FSL)**
- **Medical Image Analysis (MIA)**
- **Generative Adversarial Learning**
- **Transfer Attack**
- ...

##  🔨 **Installation**
To install BOAT, use one of the following approaches:
```bash
# Install the released package from PyPI
pip install boat-jit

# Or install from source (boat_jit branch)
git clone -b boat_jit --single-branch https://github.com/callous-youth/BOAT.git
cd BOAT
pip install -e .
```
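A quick way to confirm the installation succeeded (a minimal check; the version lookup uses the standard library and assumes the package was installed under its PyPI name `boat-jit`):

```python
# Verify that the package imports and report the installed version.
import importlib.metadata

import boat_jit

print(importlib.metadata.version("boat-jit"))  # e.g. "1.0.3"
```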

##  ⚡ **How to Use BOAT**

### **1. Load Configuration Files**
BOAT relies on two key configuration files:
- `boat_config.json`: Specifies optimization strategies and dynamic/hyper-gradient operations.
- `loss_config.json`: Defines the loss functions for both levels of the BLO process.

```python
import json
import boat_jit as boat

# Load configuration files
with open("path_to_configs/boat_config.json", "r") as f:
    boat_config = json.load(f)

with open("path_to_configs/loss_config.json", "r") as f:
    loss_config = json.load(f)
```
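If you prefer to keep everything in code, the same dictionary can be assembled directly. A minimal sketch, assuming only the operator keys that step 3 below attaches; the full schema is defined by the example configuration files shipped with the repository:

```python
# In-code alternative to boat_config.json (sketch; see the repository's
# example configs for the full schema).
boat_config = {
    "dynamic_op": ["NGD"],  # dynamic-system operations (D-OL)
    "hyper_op": ["RAD"],    # hyper-gradient operations (H-OL)
    # Model and variable entries are attached in step 3,
    # after the models are defined.
}
```

`loss_config` can be built the same way; its fields depend on your task's upper- and lower-level loss functions.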

### **2. Define Models and Optimizers**
You need to specify both the upper-level and lower-level models along with their respective optimizers.

```python
import jittor as jit

# Define models
upper_model = UpperModel(*args, **kwargs)  # Replace with your upper-level model
lower_model = LowerModel(*args, **kwargs)  # Replace with your lower-level model

# Define optimizers
upper_opt = jit.nn.Adam(upper_model.parameters(), lr=0.01)
lower_opt = jit.nn.SGD(lower_model.parameters(), lr=0.01)
```
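For concreteness, here is a toy pair of models in the same spirit. This is a hypothetical data-reweighting-style setup, not part of the BOAT API; `LowerNet` and `UpperNet` stand in for your task-specific networks:

```python
import jittor as jit


class LowerNet(jit.nn.Module):
    # Toy lower-level model: a small regressor.
    def __init__(self, in_dim=10, hidden=32):
        super().__init__()
        self.fc1 = jit.nn.Linear(in_dim, hidden)
        self.fc2 = jit.nn.Linear(hidden, 1)

    def execute(self, x):  # Jittor modules define execute() rather than forward()
        return self.fc2(jit.nn.relu(self.fc1(x)))


class UpperNet(jit.nn.Module):
    # Toy upper-level model: one learnable weight per training sample.
    def __init__(self, n_samples=32):
        super().__init__()
        self.raw_weights = jit.zeros(n_samples)

    def execute(self):
        return jit.sigmoid(self.raw_weights)  # keep weights in (0, 1)


lower_model = LowerNet()
upper_model = UpperNet()
lower_opt = jit.nn.SGD(lower_model.parameters(), lr=0.01)
upper_opt = jit.nn.Adam(upper_model.parameters(), lr=0.01)
```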

### **3. Customize BOAT Configuration**
Modify `boat_config` to include your dynamic and hyper-gradient methods, as well as the model and variable details.

```python
# Example combination of dynamic and hyper-gradient methods (demo only)
dynamic_method = ["NGD", "DI", "GDA"]  # dynamic operations (D-OL)
hyper_method = ["RGT", "RAD"]          # hyper-gradient operations (H-OL)

# Add methods and model details to the configuration
boat_config["dynamic_op"] = dynamic_method
boat_config["hyper_op"] = hyper_method
boat_config["lower_level_model"] = lower_model
boat_config["upper_level_model"] = upper_model
boat_config["lower_level_var"] = lower_model.parameters()
boat_config["upper_level_var"] = upper_model.parameters()
```
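Because `Problem` consumes this dictionary as-is, a quick guard against typos in the keys used above can save a confusing failure later (a convenience sketch, not part of the BOAT API):

```python
# Sanity-check the configuration keys populated above (sketch).
required_keys = [
    "dynamic_op", "hyper_op",
    "lower_level_model", "upper_level_model",
    "lower_level_var", "upper_level_var",
]
missing = [k for k in required_keys if k not in boat_config]
assert not missing, f"boat_config is missing keys: {missing}"
```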

### **4. Initialize the BOAT Problem**
With the configuration complete, create the problem instance and build the solvers for both levels.

```python
# Initialize the problem
b_optimizer = boat.Problem(boat_config, loss_config)

# Build solvers for lower and upper levels
b_optimizer.build_ll_solver(lower_opt)  # Lower-level solver
b_optimizer.build_ul_solver(upper_opt)  # Upper-level solver
```

### **5. Define Data Feeds**
Prepare the data feeds for both levels of the BLO process; they are subsequently fed into the upper-level and lower-level objective functions.

```python
# Define data feeds (Demo Only)
ul_feed_dict = {"data": upper_level_data, "target": upper_level_target}
ll_feed_dict = {"data": lower_level_data, "target": lower_level_target}
```
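For a quick smoke test of the pipeline, random Jittor tensors work as stand-ins (the shapes here are illustrative and match the toy models sketched in step 2):

```python
import jittor as jit

# Random stand-in data for smoke-testing (shapes are illustrative).
ll_feed_dict = {"data": jit.randn(32, 10), "target": jit.randn(32, 1)}
ul_feed_dict = {"data": jit.randn(32, 10), "target": jit.randn(32, 1)}
```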

### **6. Run the Optimization Loop**
Execute the optimization loop, optionally customizing the solver strategy for dynamic methods.

```python
# Set number of iterations
iterations = 1000

# Optimization loop (Demo Only)
for x_itr in range(iterations):
    # Run a single optimization iteration
    loss, run_time = b_optimizer.run_iter(ll_feed_dict, ul_feed_dict, current_iter=x_itr)

```
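`run_iter` returns the loss and the wall-clock time of the iteration, which makes lightweight progress logging straightforward (a sketch; it assumes the returned loss is a scalar convertible with `float`):

```python
# Optimization loop with simple progress logging (sketch).
losses = []
for x_itr in range(iterations):
    loss, run_time = b_optimizer.run_iter(ll_feed_dict, ul_feed_dict, current_iter=x_itr)
    losses.append(float(loss))  # assumes a scalar loss value
    if (x_itr + 1) % 100 == 0:
        print(f"iter {x_itr + 1}: loss = {losses[-1]:.4f} ({run_time:.3f}s)")
```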



## Related Methods

- [Hyperparameter optimization with approximate gradient (CG)](https://arxiv.org/abs/1602.02355)
- [Optimizing millions of hyperparameters by implicit differentiation (NS)](http://proceedings.mlr.press/v108/lorraine20a/lorraine20a.pdf)
- [Model-Agnostic Meta-Learning for Fast Adaptation of Deep Networks (IAD)](https://arxiv.org/abs/1703.03400)
- [On First-Order Meta-Learning Algorithms (FOA)](https://arxiv.org/abs/1803.02999)
- [Bilevel Programming for Hyperparameter Optimization and Meta-Learning (RAD)](http://export.arxiv.org/pdf/1806.04910)
- [Truncated Back-propagation for Bilevel Optimization (RGT)](https://arxiv.org/pdf/1810.10667.pdf)
- [DARTS: Differentiable Architecture Search (FD)](https://arxiv.org/pdf/1806.09055.pdf)
- [A Generic First-Order Algorithmic Framework for Bi-Level Programming Beyond Lower-Level Singleton (GDA)](https://arxiv.org/pdf/2006.04045.pdf)
- [Towards gradient-based bilevel optimization with non-convex followers and beyond (PTT, DI)](https://proceedings.neurips.cc/paper_files/paper/2021/file/48bea99c85bcbaaba618ba10a6f69e44-Paper.pdf)
- [Averaged Method of Multipliers for Bi-Level Optimization without Lower-Level Strong Convexity (DM)](https://proceedings.mlr.press/v202/liu23y/liu23y.pdf)
- [Learning With Constraint Learning: New Perspective, Solution Strategy and Various Applications (IGA)](https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=10430445)
- [BOME! Bilevel Optimization Made Easy: A Simple First-Order Approach (VFM)](https://proceedings.neurips.cc/paper_files/paper/2022/file/6dddcff5b115b40c998a08fbd1cea4d7-Paper-Conference.pdf)
- [A Value-Function-based Interior-point Method for Non-convex Bi-level Optimization (VSM)](http://proceedings.mlr.press/v139/liu21o/liu21o.pdf)
- [On Penalty-based Bilevel Gradient Descent Method (PGDM)](https://proceedings.mlr.press/v202/shen23c/shen23c.pdf)
- [Moreau Envelope for Nonconvex Bi-Level Optimization: A Single-loop and Hessian-free Solution Strategy (MESM)](https://arxiv.org/pdf/2405.09927)


## License

MIT License

Copyright (c) 2024 Yaohua Liu

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.




            

Raw data

            {
    "_id": null,
    "home_page": "https://github.com/callous-youth/BOAT/tree/boat_jit",
    "name": "boat-jit",
    "maintainer": null,
    "docs_url": null,
    "requires_python": ">=3.8.0",
    "maintainer_email": null,
    "keywords": "Bilevel-optimization, Learning and vision, Python, Deep learning, Jittor",
    "author": "Yaohua Liu, Xianghao Jiao, Risheng Liu",
    "author_email": "liuyaohua.918@gmail.com",
    "download_url": "https://files.pythonhosted.org/packages/bd/fd/b52bd7b1f14e65b12b69556fa4fc36800a91003e4ebcc106f06c5aa6c9fd/boat_jit-1.0.3.tar.gz",
    "platform": null,
    "description": "\n# BOAT --- Task-Agnostic Operation Toolbox for Gradient-based Bilevel Optimization\n[![PyPI version](https://badge.fury.io/py/boml.svg)](https://badge.fury.io/py/boml)\n![GitHub Actions Workflow Status](https://img.shields.io/github/actions/workflow/status/callous-youth/BOAT/workflow.yml)\n[![codecov](https://codecov.io/github/callous-youth/BOAT/graph/badge.svg?token=0MKAOQ9KL3)](https://codecov.io/github/callous-youth/BOAT)\n![GitHub commit activity](https://img.shields.io/github/commit-activity/w/callous-youth/BOAT)\n[![pages-build-deployment](https://github.com/callous-youth/BOAT/actions/workflows/pages/pages-build-deployment/badge.svg)](https://github.com/callous-youth/BOAT/actions/workflows/pages/pages-build-deployment)\n![GitHub top language](https://img.shields.io/github/languages/top/callous-youth/BOAT)\n![GitHub language count](https://img.shields.io/github/languages/count/callous-youth/BOAT)\n![Python version](https://img.shields.io/pypi/pyversions/boml)\n![license](https://img.shields.io/badge/license-MIT-000000.svg)\n![Code style: black](https://img.shields.io/badge/code%20style-black-000000.svg)\n\n**BOAT** is a task-agnostic, gradient-based **Bi-Level Optimization (BLO)** Python library that focuses on abstracting the key BLO process into modular, flexible components. It enables researchers and developers to tackle learning tasks with hierarchical nested nature by providing customizable and diverse operator decomposition, encapsulation, and combination. BOAT supports specialized optimization strategies, including second-order or first-order, nested or non-nested, and with or without theoretical guarantees, catering to various levels of complexity.\n\nTo enhance flexibility and efficiency, BOAT incorporates the **Dynamic Operation Library (D-OL)** and the **Hyper Operation Library (H-OL)**, alongside a collection of state-of-the-art first-order optimization strategies. BOAT also provides multiple implementation versions:\n- **[PyTorch-based](https://github.com/callous-youth/BOAT)**: An efficient and widely-used version.\n- **[Jittor-based](https://github.com/callous-youth/BOAT/tree/boat_jit)**: An accelerated version for high-performance tasks.\n- **[MindSpore-based](https://github.com/callous-youth/BOAT/tree/boat_ms)**: Incorporating the latest first-order optimization strategies to support emerging application scenarios.\n\n\nBOAT is designed to offer robust computational support for a broad spectrum of BLO research and applications, enabling innovation and efficiency in machine learning and computer vision.\n\n\n## \ud83d\udd11  **Key Features**\n- **Dynamic Operation Library (D-OL)**: Incorporates 4 advanced dynamic system construction operations, enabling users to flexibly tailor optimization trajectories for BLO tasks.\n- **Hyper-Gradient Operation Library (H-OL)**: Provides 9 refined operations for hyper-gradient computation, significantly enhancing the precision and efficiency of gradient-based BLO methods.\n- **First-Order Gradient Methods (FOGMs)**: Integrates 4 state-of-the-art first-order methods, enabling fast prototyping and validation of new BLO algorithms. 
With modularized design, BOAT allows flexible combinations of multiple upper-level and lower-level operators, resulting in over **63+16** (nearly 80) algorithmic combinations, offering unparalleled adaptability.\n- **Modularized Design for Customization**: Empowers users to flexibly combine dynamic and hyper-gradient operations while customizing the specific forms of problems, parameters, and optimizer choices, enabling seamless integration into diverse task-specific codes.\n- **Comprehensive Testing & Continuous Integration**: Achieves **99% code coverage** through rigorous testing with **pytest** and **Codecov**, coupled with continuous integration via **GitHub Actions**, ensuring software robustness and reliability.\n- **Fast Prototyping & Algorithm Validation**: Streamlined support for defining, testing, and benchmarking new BLO algorithms.\n- **Unified Computational Analysis**: Offers a comprehensive complexity analysis of gradient-based BLO techniques to guide users in selecting optimal configurations for efficiency and accuracy.\n- **Detailed Documentation & Community Support**: Offers thorough documentation with practical examples and API references via **MkDocs**, ensuring accessibility and ease of use for both novice and advanced users.\n\n##  \ud83d\ude80 **Why BOAT?**\nExisting automatic differentiation (AD) tools primarily focus on specific and fundamental optimization strategies, such as explicit or implicit methods, and are often targeted at meta-learning or specific application scenarios, lacking support for algorithm customization. \n\nIn contrast, **BOAT** expands the landscape of Bi-Level Optimization (BLO) applications by supporting a broader range of problem-adaptive operations. It bridges the gap between theoretical research and practical deployment, offering unparalleled flexibility to design, customize, and accelerate BLO techniques.\n\n\n##  \ud83c\udfed **Applications**\nBOAT enables efficient implementation and adaptation of advanced BLO techniques for key applications, including but not limited to:\n- **Hyperparameter Optimization (HO)**\n- **Neural Architecture Search (NAS)**\n- **Adversarial Training (AT)**\n- **Few-Shot Learning (FSL)**\n- **Medical Image Analysis (MIA)**\n- **Generative Adversarial Learning**\n- **Transfer Attack**\n- ...\n\n##  \ud83d\udd28 **Installation**\nTo install BOAT, use the following command:\n```bash\npip install boat-jit \nor \ngit clone -b boat_jit --single-branch https://github.com/callous-youth/BOAT.git\npip install -e .\n```\n\n##  \u26a1 **How to Use BOAT**\n\n### **1. Load Configuration Files**\nBOAT relies on two key configuration files:\n- `boat_config.json`: Specifies optimization strategies and dynamic/hyper-gradient operations.\n- `loss_config.json`: Defines the loss functions for both levels of the BLO process.\n\n```python\nimport os\nimport json\nimport boat_jit as boat\n\n# Load configuration files\nwith open(\"path_to_configs/boat_config.json\", \"r\") as f:\n    boat_config = json.load(f)\n\nwith open(\"path_to_configs/loss_config.json\", \"r\") as f:\n    loss_config = json.load(f)\n```\n\n### **2. 
Define Models and Optimizers**\nYou need to specify both the upper-level and lower-level models along with their respective optimizers.\n\n```python\nimport jittor as jit\n\n# Define models\nupper_model = UpperModel(*args, **kwargs)  # Replace with your upper-level model\nlower_model = LowerModel(*args, **kwargs)  # Replace with your lower-level model\n\n# Define optimizers\nupper_opt = jit.nn.Adam(upper_model.parameters(), lr=0.01)\nlower_opt = jit.nn.SGD(lower_model.parameters(), lr=0.01)\n```\n\n### **3. Customize BOAT Configuration**\nModify the boat_config to include your dynamic and hyper-gradient methods, as well as model and variable details.\n\n```python\n# Example dynamic and hyper-gradient methods Combination.\ndynamic_method = [\"NGD\",\"DI\", \"GDA\"]  # Dynamic Methods (Demo Only)\nhyper_method = [\"RGT\",\"RAD\"]          # Hyper-Gradient Methods (Demo Only)\n\n# Add methods and model details to the configuration\nboat_config[\"dynamic_op\"] = dynamic_method\nboat_config[\"hyper_op\"] = hyper_method\nboat_config[\"lower_level_model\"] = lower_model\nboat_config[\"upper_level_model\"] = upper_model\nboat_config[\"lower_level_var\"] = lower_model.parameters()\nboat_config[\"upper_level_var\"] = upper_model.parameters()\n```\n\n### **4. Initialize the BOAT Problem**\nModify the boat_config to include your dynamic and hyper-gradient methods, as well as model and variable details.\n\n```python\n# Initialize the problem\nb_optimizer = boat.Problem(boat_config, loss_config)\n\n# Build solvers for lower and upper levels\nb_optimizer.build_ll_solver(lower_opt)  # Lower-level solver\nb_optimizer.build_ul_solver(upper_opt)  # Upper-level solver\n```\n\n### **5. Define Data Feeds**\nPrepare the data feeds for both levels of the BLO process, which was further fed into the the upper-level  and lower-level objective functions. \n\n```python\n# Define data feeds (Demo Only)\nul_feed_dict = {\"data\": upper_level_data, \"target\": upper_level_target}\nll_feed_dict = {\"data\": lower_level_data, \"target\": lower_level_target}\n```\n\n### **6. 
Run the Optimization Loop**\nExecute the optimization loop, optionally customizing the solver strategy for dynamic methods.\n\n```python\n# Set number of iterations\niterations = 1000\n\n# Optimization loop (Demo Only)\nfor x_itr in range(iterations):\n    # Run a single optimization iteration\n    loss, run_time = b_optimizer.run_iter(ll_feed_dict, ul_feed_dict, current_iter=x_itr)\n\n```\n\n\n\n## Related Methods\n\n- [Hyperparameter optimization with approximate gradient (CG)](https://arxiv.org/abs/1602.02355)\n- [Optimizing millions of hyperparameters by implicit differentiation (NS)](http://proceedings.mlr.press/v108/lorraine20a/lorraine20a.pdf)\n- [Model-Agnostic Meta-Learning for Fast Adaptation of Deep Networks (IAD)](https://arxiv.org/abs/1703.03400)\n- [On First-Order Meta-Learning Algorithms (FOA)](https://arxiv.org/abs/1703.03400)\n- [Bilevel Programming for Hyperparameter Optimization and Meta-Learning (RAD)](http://export.arxiv.org/pdf/1806.04910)\n- [Truncated Back-propagation for Bilevel Optimization (RGT)](https://arxiv.org/pdf/1810.10667.pdf)\n- [DARTS: Differentiable Architecture Search (FD)](https://arxiv.org/pdf/1806.09055.pdf)\n- [A Generic First-Order Algorithmic Framework for Bi-Level Programming Beyond Lower-Level Singleton (GDA)](https://arxiv.org/pdf/2006.04045.pdf)\n- [Towards gradient-based bilevel optimization with non-convex followers and beyond (PTT, DI)](https://proceedings.neurips.cc/paper_files/paper/2021/file/48bea99c85bcbaaba618ba10a6f69e44-Paper.pdf)\n- [Averaged Method of Multipliers for Bi-Level Optimization without Lower-Level Strong Convexity(DM)](https://proceedings.mlr.press/v202/liu23y/liu23y.pdf)\n- [Learning With Constraint Learning: New Perspective, Solution Strategy and Various Applications (IGA)](https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=10430445)\n- [BOME! Bilevel Optimization Made Easy: A Simple First-Order Approach (VFM)](https://proceedings.neurips.cc/paper_files/paper/2022/file/6dddcff5b115b40c998a08fbd1cea4d7-Paper-Conference.pdf)\n- [A Value-Function-based Interior-point Method for Non-convex Bi-level Optimization (VSM)](http://proceedings.mlr.press/v139/liu21o/liu21o.pdf)\n- [On Penalty-based Bilevel Gradient Descent Method (PGDM)](https://proceedings.mlr.press/v202/shen23c/shen23c.pdf)\n- [Moreau Envelope for Nonconvex Bi-Level Optimization: A Single-loop and Hessian-free Solution Strategy (MESM)](https://arxiv.org/pdf/2405.09927)\n\n\n## License\n\nMIT License\n\nCopyright (c) 2024 Yaohua Liu\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. 
IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n\n\n\n",
    "bugtrack_url": null,
    "license": "MIT",
    "summary": "A Bilevel Optimization Toolkit in Python for Learning and Vision Tasks Based on Jittor",
    "version": "1.0.3",
    "project_urls": {
        "Homepage": "https://github.com/callous-youth/BOAT/tree/boat_jit"
    },
    "split_keywords": [
        "bilevel-optimization",
        " learning and vision",
        " python",
        " deep learning",
        " jittor"
    ],
    "urls": [
        {
            "comment_text": null,
            "digests": {
                "blake2b_256": "87018ab5bbadb623111985ece3e82220ad9977ac203110e139a582c2a18dd081",
                "md5": "5c3c955aee2ac3c22366388c3dd8881b",
                "sha256": "5acb6f3ef29f9790903a91814eac2480a01bb013f488aac982f1e59e55871be6"
            },
            "downloads": -1,
            "filename": "boat_jit-1.0.3-py3-none-any.whl",
            "has_sig": false,
            "md5_digest": "5c3c955aee2ac3c22366388c3dd8881b",
            "packagetype": "bdist_wheel",
            "python_version": "py3",
            "requires_python": ">=3.8.0",
            "size": 74868,
            "upload_time": "2025-10-20T11:54:41",
            "upload_time_iso_8601": "2025-10-20T11:54:41.444679Z",
            "url": "https://files.pythonhosted.org/packages/87/01/8ab5bbadb623111985ece3e82220ad9977ac203110e139a582c2a18dd081/boat_jit-1.0.3-py3-none-any.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": null,
            "digests": {
                "blake2b_256": "bdfdb52bd7b1f14e65b12b69556fa4fc36800a91003e4ebcc106f06c5aa6c9fd",
                "md5": "b7c48f3ad7a7f6cf5c80bdda71e1f139",
                "sha256": "c5d449395075fe074a39307a161cfd5f836945c36257f27ff233f7534e4085a5"
            },
            "downloads": -1,
            "filename": "boat_jit-1.0.3.tar.gz",
            "has_sig": false,
            "md5_digest": "b7c48f3ad7a7f6cf5c80bdda71e1f139",
            "packagetype": "sdist",
            "python_version": "source",
            "requires_python": ">=3.8.0",
            "size": 55643,
            "upload_time": "2025-10-20T11:54:42",
            "upload_time_iso_8601": "2025-10-20T11:54:42.831617Z",
            "url": "https://files.pythonhosted.org/packages/bd/fd/b52bd7b1f14e65b12b69556fa4fc36800a91003e4ebcc106f06c5aa6c9fd/boat_jit-1.0.3.tar.gz",
            "yanked": false,
            "yanked_reason": null
        }
    ],
    "upload_time": "2025-10-20 11:54:42",
    "github": true,
    "gitlab": false,
    "bitbucket": false,
    "codeberg": false,
    "github_user": "callous-youth",
    "github_project": "BOAT",
    "travis_ci": false,
    "coveralls": false,
    "github_actions": true,
    "requirements": [
        {
            "name": "numpy",
            "specs": []
        },
        {
            "name": "setuptools",
            "specs": []
        },
        {
            "name": "torch",
            "specs": [
                [
                    ">=",
                    "1.4"
                ]
            ]
        },
        {
            "name": "higher",
            "specs": []
        },
        {
            "name": "matplotlib",
            "specs": []
        }
    ],
    "lcname": "boat-jit"
}
        