# OptSchedule

* Version: 1.0.0
* Homepage: <https://github.com/draktr/optschedule>
* Author: draktr
* Requires: Python >=3.6, numpy >=1.21.6
* Keywords: schedule, optimization, decay, learning, parameters, training

Flexible parameter scheduler that can be implemented with proprietary and open source optimizers and algorithms.

* Free software: MIT license
* Documentation: <https://optschedule.readthedocs.io/en/latest/>

## Installation

`optschedule` can be installed with pip, Python's package installer. To install, run

```sh
pip install optschedule
```

in your terminal. Alternatively, install the package directly from GitHub:

```sh
git clone -b development https://github.com/draktr/optschedule.git
cd optschedule
python setup.py install
```
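
Either way, a quick smoke test confirms the package is importable:

```sh
python -c "import optschedule"
```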

## Features

* Exponential decay (gradual and staircase)
* Cosine decay
* Inverse time decay (gradual and staircase)
* Polynomial decay
* Piecewise constant decay
* Constant schedule
* Geometric decay
* Arithmetic decay
* Time decay
* Step decay

## Advantages

* **FLEXIBLE** - the package is designed to be simple and compatible with existing implementations and custom algorithms
* **COMPREHENSIVE** - the package contains one of the largest collections of schedules of any Python package. For more, feel free to raise a feature request in Issues.
* **NUMBA FRIENDLY** - schedules produced by the package are plain arrays, so they are compatible with Numba and will not cause any issues if the rest of the algorithm is Numba compatible. This can drastically speed up the algorithm.
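
As a minimal sketch of the Numba point (assuming, as the Usage section below notes, that schedules are returned as NumPy-compatible arrays), a schedule can be consumed inside a jitted loop:

```python
from numba import njit

import optschedule as sch


@njit
def descend(schedule):
    # Gradient descent on f(x) = (x + 2)^2, the objective from the example below
    x = 10.0
    for lr in schedule:
        x -= lr * 2.0 * (x + 2.0)  # analytic gradient of (x + 2)^2
    return x


schedule = sch.exponential_decay(n_steps=1000, initial_value=0.1, decay_rate=0.5)
print(descend(schedule))
```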

## Usage

The package contains functions that return an array of values that can serve as a pre-defined parameter schedule (e.g. a learning rate schedule). The package can also be used to manually assign varying weights to abstract particles. Overall, due to the general nature of the package, users may find their own particular applications.
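
As a minimal sketch (using the `exponential_decay` signature from the example below; the other schedule functions listed under Features are assumed to follow an analogous pattern):

```python
import optschedule as sch

# One value per step: a 10-element array decaying exponentially from 0.1
schedule = sch.exponential_decay(n_steps=10, initial_value=0.1, decay_rate=0.5)

print(len(schedule))  # 10, i.e. one value per step
print(schedule[:3])   # inspect the first few values
```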

### Example: Variable Learning Rate in Gradient Descent Algorithm

In a gradient descent algorithm, the user might want to decrease the learning rate as the algorithm converges. This can improve the numerical stability of the algorithm, as well as decrease the approximation error. A simple implementation example is provided below:

```python
import numpy as np

import optschedule as sch

# Function to be minimized (objective function) $ f(x) = (x+2)^2 $
def foo(params):
    return (params[0] + 2) ** 2

# Creating a learning rate schedule
learning_rate = sch.exponential_decay(n_steps=1000, initial_value=0.1, decay_rate=0.5)

# Array that records the objective value at each epoch
objective = np.zeros(1000)
# Initial parameter value
param = [10]
# Step size of the forward finite difference used to approximate the gradient
d = 0.01

# Gradient descent with a per-epoch learning rate taken from the schedule
for epoch, l in enumerate(learning_rate):
    objective[epoch] = foo(param)
    difference_objective = foo([param[0] + d])
    param[0] = param[0] - l * (difference_objective - objective[epoch]) / d

print(f"Solution: {param[0]}")
```
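
Since the objective $ f(x) = (x+2)^2 $ attains its minimum at $ x = -2 $, the printed solution should be close to -2. Note that the loop approximates the gradient with the forward difference quotient $ (f(x + d) - f(x)) / d $, so the same pattern applies to objectives without a closed-form derivative.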

## Maintaining and Contributing

Feel free to reach out through the Issues forum to add or request features. Any issues, bug reports, and improvement recommendations are very welcome.

            
