| Name | porgo |
| Version | 1.0.1 |
| Summary | The portable universal library in global optimization. |
| home_page | https://github.com/linjing-lab/porgo |
| download | https://github.com/linjing-lab/porgo/tags |
| upload_time | 2024-04-09 10:41:49 |
| maintainer | None |
| docs_url | None |
| author | 林景 |
| requires_python | None |
| license | MIT |
| keywords | None |
| bugtrack_url | None |
| requirements | No requirements were recorded. |
| Travis-CI | No Travis. |
| coveralls test coverage | No coveralls. |
# porgo
While researching a function whose local minima are not all known, such as the function below:
$$
f(x)=\sum_{i=1}^{n/2}(-13+x_{2i-1}+((5-x_{2i})x_{2i}-2)x_{2i})^2+(-29+x_{2i-1}+((x_{2i}+1)x_{2i}-14)x_{2i})^2.
$$
I used `optimtool.unconstrain` to search for local minima and found it efficient at locating the nearest minimum point. Adding a mechanism to jump out of the local region would increase the runtime of the whole script, so `porgo` is a new program that accelerates the search for global minima.
Refer to test.py; the global minimum of the 4-dimensional $f(x)$ is at (5, 4, 5, 4).
## glos
glos is the main runtime, serving as a global search class; users can run the train_gen module with a given number of cycles as many times as needed until the search converges.
init:
- objective_function: *Callable*, a high-dimensional function that may be convex, non-convex, or have many local minima.
- bounds: *List[List[float]] | List[Tuple[float]]*, changing this value has a significant influence on best and best_fit.
- mutation: *float=0.5*, increasing this value enlarges the search radius.
- recombination: *float=0.9*, increasing this value allows more of the mutant vector to enter the trial vector.
rand_pop:
- population_size: *int=50*, randomly initializes the population (also called the initial points) with shape (population_size, dimension).
- verbose: *bool=False*, whether to print the initial population when manually replacing the random generation rule.
train_gen:
- cycles: *int=1000*, run train_gen several times (until convergence) with a smaller cycle count when the search bounds cover a large space.
result:
- verbose: *bool=False*, whether to print console information after the search population has been updated (check self.best and self.best_fit).
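porgo's own update rule is not shown in this page; as a hedged illustration of the technique the reference section cites, here is a minimal DE/rand/1/bin sketch using the same parameter names (mutation, recombination, population_size, cycles). The function `diff_evolution` and its signature are assumptions for this example, not porgo's implementation:

```python
import numpy as np

def diff_evolution(func, bounds, mutation=0.5, recombination=0.9,
                   population_size=50, cycles=1000, seed=0):
    """Classic DE/rand/1/bin (Storn & Price, 1997); a sketch, not porgo's code."""
    rng = np.random.default_rng(seed)
    bounds = np.asarray(bounds, dtype=float)
    dim = len(bounds)
    lo, hi = bounds[:, 0], bounds[:, 1]
    # random initial population, shape (population_size, dim)
    pop = rng.uniform(lo, hi, size=(population_size, dim))
    fit = np.array([func(ind) for ind in pop])
    for _ in range(cycles):
        for j in range(population_size):
            # pick three distinct individuals other than j
            idx = rng.choice([k for k in range(population_size) if k != j],
                             3, replace=False)
            a, b, c = pop[idx]
            # mutation: larger factor -> larger search radius
            mutant = np.clip(a + mutation * (b - c), lo, hi)
            # binomial crossover: larger rate -> more mutant genes kept
            cross = rng.random(dim) < recombination
            cross[rng.integers(dim)] = True  # guarantee at least one gene
            trial = np.where(cross, mutant, pop[j])
            f_trial = func(trial)
            if f_trial <= fit[j]:          # greedy selection
                pop[j], fit[j] = trial, f_trial
    best = int(np.argmin(fit))
    return pop[best], fit[best]
```

Run on the 4-dimensional $f(x)$ above with bounds [0, 10] per coordinate, this scheme settles near (5, 4, 5, 4) with objective value close to 0.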
## reference
Storn, R. and Price, K., "Differential Evolution - a Simple and Efficient Heuristic for Global Optimization over Continuous Spaces", Journal of Global Optimization, 1997, 11, 341-359.
## LICENSE
[MIT LICENSE](./LICENSE)