pyeas 0.2.0 (PyPI)

- Home page: https://github.com/benedictjones/pyeas
- Summary: Implements Evolutionary Algorithms and tools
- Author: benedictjones
- License: BSD License (BSD-3-Clause)
- Keywords: python, Differential Evolution, DE, OpenAI Evolutionary Strategy, OpenAI-ES, CMAES, Page Trend Test
- Upload time: 2023-08-13 18:44:25

# PyEAs
A Python library implementing some Evolutionary Algorithms (EAs).
These follow an ask/tell workflow where one:
- requests new trial members/samples using `ask()`; these must be evaluated manually,
- feeds the fitness scores back to the object with `tell()` to enable a population update.

This structure makes the code simple to use with any model.

Current features include:
- Three EAs (DE, OpenAI-ES, CMAES)
- Page Trend test for convergence statistical analysis

## Algorithms

### DE
Differential Evolution (DE) is an easily implemented and effective optimisation algorithm for real-valued parameters.
DE is a derivative-free, stochastic, population-based, heuristic direct search method [1] which requires only a few robust control variables.
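
For intuition, here is a minimal numpy sketch of the classic DE/best/1/bin generation step (an illustration of the algorithm, not the pyeas internals): a mutant is built from the best member plus a scaled difference of two random members, and binomial crossover then mixes the mutant with each target member.

```
import numpy as np

def de_best_1_bin(pop, fitnesses, mut=0.6, crossp=0.6, rng=np.random.default_rng()):
    # one DE/best/1/bin generation step (illustrative sketch, minimising)
    n_pop, n_dim = pop.shape
    best = pop[np.argmin(fitnesses)]
    trials = np.empty_like(pop)
    for i in range(n_pop):
        # pick two distinct members (other than i) for the difference vector
        r1, r2 = rng.choice([j for j in range(n_pop) if j != i], size=2, replace=False)
        mutant = best + mut * (pop[r1] - pop[r2])
        # binomial crossover: take each gene from the mutant with probability crossp
        mask = rng.random(n_dim) < crossp
        mask[rng.integers(n_dim)] = True  # guarantee at least one mutant gene
        trials[i] = np.where(mask, mutant, pop[i])
    return trials
```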

A simple workflow might be:
```
import numpy as np
from pyeas import DE

def f(x):
    # example objective function (sphere); replace with your own model
    return np.sum(np.asarray(x)**2)

num_gens = 100  # number of generations

# create the optimiser, passing in hyperparameters and boundaries for each optimisable parameter
optimizer = DE(mut=0.6,
               crossp=0.6,
               bounds=np.array([[-5,5],[-5,5]]))

for generation in range(num_gens):

    # ask for a trial population
    trial_pop = optimizer.ask(loop=generation)

    # loop through the trial population and evaluate each member's fitness
    solutions = []
    for trial in trial_pop:
        fitness = f(trial)
        solutions.append(fitness)

    # tell the optimiser the evaluated fitnesses
    optimizer.tell(solutions, trial_pop)
```

An example of DE/best/1/bin solving some simple problems, as generated by `eg_funcs.py`:

<p float="left">
  <img src="https://raw.githubusercontent.com/benedictjones/pyeas/main/examples/DE_bohachevsky.gif" width="49%" />
  <img src="https://raw.githubusercontent.com/benedictjones/pyeas/main/examples/DE_6hc.gif" width="49%" />
</p>
<p float="left">
  <img src="https://raw.githubusercontent.com/benedictjones/pyeas/main/examples/DE_beale.gif" width="49%" />
  <img src="https://raw.githubusercontent.com/benedictjones/pyeas/main/examples/DE_kean.gif" width="49%" />
</p>


An example of DE/ttb/1/bin fitting a 5th-order polynomial to noisy cos(x) data:

![](https://raw.githubusercontent.com/benedictjones/pyeas/main/examples/DE.gif)
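
As a hedged sketch of how such a curve fit maps onto the ask/tell loop: each member can be a vector of six polynomial coefficients, with fitness being the mean squared error against the noisy data (the data generation and bounds here are illustrative, not those of `eg_funcs.py`):

```
import numpy as np
from pyeas import DE

rng = np.random.default_rng(0)
x = np.linspace(-np.pi, np.pi, 50)
y_noisy = np.cos(x) + rng.normal(0, 0.1, x.shape)  # noisy cos(x) data

def f(coeffs):
    # mean squared error of a 5th-order polynomial against the data
    return np.mean((np.polyval(coeffs, x) - y_noisy)**2)

optimizer = DE(mut=0.6,
               crossp=0.6,
               bounds=np.array([[-2, 2]]*6))  # six coefficients, illustrative bounds

for generation in range(200):
    trial_pop = optimizer.ask(loop=generation)
    solutions = [f(trial) for trial in trial_pop]
    optimizer.tell(solutions, trial_pop)
```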


### OpenAI ES
Evolutionary Strategies (ES) involve evaluating a population of real-valued genotypes (or population members), after which the best members are kept and the others discarded.

Natural Evolutionary Strategies (NES) are a family of Evolution Strategies which iteratively update a search distribution using an estimated gradient on its distribution parameters.
Notably, NES performs gradient ascent along the natural gradient.

The OpenAI Evolutionary Strategy (OAIES) algorithm is a type of NES [2], implemented here as vanilla gradient descent, with momentum, or with the Adam optimiser [3].
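
For intuition, a minimal sketch of the vanilla update (an illustration of the method in [2], not the pyeas internals): perturb the parent with Gaussian noise, evaluate the pseudo-population, and estimate the gradient as a noise-weighted average of the normalised fitnesses.

```
import numpy as np

def oaies_step(parent, f, alpha=0.01, sigma=0.002, pop_size=20,
               rng=np.random.default_rng()):
    # one vanilla OpenAI-ES update step (illustrative sketch, minimising)
    eps = rng.standard_normal((pop_size, parent.size))     # noise samples
    fits = np.array([f(parent + sigma * e) for e in eps])  # evaluate pseudo-population
    fits = (fits - fits.mean()) / (fits.std() + 1e-8)      # normalise fitnesses
    grad = eps.T @ fits / (pop_size * sigma)               # gradient estimate
    return parent - alpha * grad                           # step downhill (minimising)
```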

A simple workflow might be:
```
import numpy as np
from pyeas import OAIES

def f(x):
    # example objective function (sphere); replace with your own model
    return np.sum(np.asarray(x)**2)

num_gens = 100  # number of generations

# create the optimiser, passing in hyperparameters and boundaries for each optimisable parameter
optimizer = OAIES(alpha=0.01,
                  sigma=0.002,
                  bounds=np.array([[-10,10],[-10,10]]),
                  population_size=20,
                  optimiser='adam',
                  seed=1)

for generation in range(num_gens):

    # ask for a pseudo-population
    trial_pop = optimizer.ask(loop=generation)

    # loop through the pseudo-population and evaluate each member's fitness
    solutions = []
    for trial in trial_pop:
        fitness = f(trial)
        solutions.append(fitness)

    # tell the optimiser the evaluated fitnesses
    optimizer.tell(solutions, trial_pop)

    # calculate the new parent fitness, and tell again!
    parent_fit = f(optimizer.parent)
    optimizer.tellAgain(parent_fit)
```


An example of OAIES solving some simple problems, as generated by `eg_funcs.py`:
<p float="left">
  <img src="https://raw.githubusercontent.com/benedictjones/pyeas/main/examples/OAIES_bohachevsky.gif" width="49%" />
  <img src="https://raw.githubusercontent.com/benedictjones/pyeas/main/examples/OAIES_6hc.gif" width="49%" />
</p>
<p float="left">
  <img src="https://raw.githubusercontent.com/benedictjones/pyeas/main/examples/OAIES_beale.gif" width="49%" />
  <img src="https://raw.githubusercontent.com/benedictjones/pyeas/main/examples/OAIES_kean.gif" width="49%" />
</p>


### CMAES
The Covariance Matrix Adaptation Evolution Strategy (CMAES) algorithm is a popular stochastic method for real-parameter (continuous domain) optimization [4].
In CMAES, a population of new search points is generated by sampling a Multivariate Gaussian Distribution (MGD); the fitness results are then used to update and adapt the MGD.
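
The sampling step itself is simple; a minimal numpy sketch (not the pyeas internals) of drawing a population from the current MGD:

```
import numpy as np

def sample_population(mean, sigma, C, pop_size, rng=np.random.default_rng()):
    # draw pop_size search points from N(mean, sigma^2 * C) (illustrative sketch)
    return rng.multivariate_normal(mean, (sigma**2) * C, size=pop_size)

# e.g., 20 points around the origin with an identity covariance matrix
pop = sample_population(np.zeros(2), sigma=0.5, C=np.eye(2), pop_size=20)
```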

A simple workflow might be:
```
import numpy as np
from pyeas import CMAES

def f(x):
    # example objective function (sphere); replace with your own model
    return np.sum(np.asarray(x)**2)

bound = [[-10,10],[-10,10]]  # boundaries for each optimisable parameter
num_gens = 100               # number of generations

# create the optimiser, passing in hyperparameters and boundaries
optimizer = CMAES(start='mean',
                  sigma=0.002,
                  bounds=np.array(bound),
                  seed=2)

for generation in range(num_gens):

    # ask for a pseudo-population
    trial_pop = optimizer.ask()

    # loop through the pseudo-population and evaluate each member's fitness
    solutions = []
    for trial in trial_pop:
        fitness = f(trial)
        solutions.append(fitness)

    # tell the optimiser the evaluated fitnesses
    optimizer.tell(solutions, trial_pop)

    # calculate the new parent fitness, and tell again!
    parent_fit = f(optimizer.parent)
    optimizer.tellAgain(parent_fit)
```

This implementation of CMAES can have its starting location set as either (see the sketch below):
- a passed-in array denoting starting values,
- the mean of the boundaries (`'mean'`),
- a random initial population used to select a good starting point (`'random'`).
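
A hedged sketch of the three options, assuming the same constructor signature as in the workflow above (the boundaries and start values are illustrative):

```
bound = [[-10,10], [-10,10]]  # illustrative boundaries

optimizer = CMAES(start=np.array([0.5, -0.5]), sigma=0.002, bounds=np.array(bound))  # explicit start values
optimizer = CMAES(start='mean', sigma=0.002, bounds=np.array(bound))    # mean of the boundaries
optimizer = CMAES(start='random', sigma=0.002, bounds=np.array(bound))  # best of a random initial population
```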


An example of CMAES solving some simple problems (with random starting locations), as generated by `eg_funcs.py`:
<p float="left">
  <img src="https://raw.githubusercontent.com/benedictjones/pyeas/main/examples/CMAES_bohachevsky.gif" width="49%" />
  <img src="https://raw.githubusercontent.com/benedictjones/pyeas/main/examples/CMAES_6hc.gif" width="49%" />
</p>
<p float="left">
  <img src="https://raw.githubusercontent.com/benedictjones/pyeas/main/examples/CMAES_beale.gif" width="49%" />
  <img src="https://raw.githubusercontent.com/benedictjones/pyeas/main/examples/CMAES_kean.gif" width="49%" />
</p>

## Comparing Algorithms

Different hyperparameter values can result in wildly different convergence speeds per number of evaluations/computations executed.
In cases where an evaluation is computationally expensive, the algorithm should ideally provide meaningful generational updates while requiring only a minimal number of evaluations per generation [5].

Consider how we might set a maximum limit on the number of computations/evaluations (e.g., for the DE algorithm):
```
import numpy as np
from pyeas import DE

def f(x):
    # example objective function (sphere); replace with your own model
    return np.sum(np.asarray(x)**2)

max_ncomps = 500  # maximum number of computations (i.e., evaluations)

optimizer = DE(mut=0.6,
               crossp=0.6,
               bounds=np.array([[-5,5],[-5,5]]))

# break after the set number of computations (i.e., evaluations)
generation = 0
while optimizer.evals < max_ncomps:

    # ask for a trial population
    trial_pop = optimizer.ask(loop=generation)

    # loop through the trial population and evaluate each member's fitness
    solutions = []
    for trial in trial_pop:
        fitness = f(trial)
        solutions.append(fitness)

    # tell the optimiser the evaluated fitnesses
    optimizer.tell(solutions, trial_pop)
    generation += 1
```

We can then use this to make fairer comparisons between algorithms.
Consider some results from `eg_funcs_ncomps.py` showing the mean parent fitness convergence:

<p float="left">
  <img src="https://raw.githubusercontent.com/benedictjones/pyeas/main/examples/EAs_comparison_bohachevsky.png" width="49%" />
  <img src="https://raw.githubusercontent.com/benedictjones/pyeas/main/examples/EAs_comparison_6hc.png" width="49%" />
</p>
<p float="left">
  <img src="https://raw.githubusercontent.com/benedictjones/pyeas/main/examples/EAs_comparison_beale.png" width="49%" />
  <img src="https://raw.githubusercontent.com/benedictjones/pyeas/main/examples/EAs_comparison_kean.png" width="49%" />
</p>

Notice how the CMAES and OAIES fitnesses can go up and down, unlike the fitnesses of the DE parent(s), which can only ever decrease.

Also notice how CMAES has many more generational updates in the allotted 500 computations/evaluations, whereas the DE algorithm has fewer generational updates since (in this example) its population size is set much larger.
We are now in the realm of hyperparameter tuning...


### Page Trend Test

Finally, to perform an actual statistical test comparing EA convergence, we can use the Page trend test [6], which is implemented here.
We also address the possibility that two algorithms might have different x-axis values (as in the previous section). An example is given in `eg_PageTest_interp.py`.
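
For intuition, a minimal sketch of Page's L statistic itself (an illustration of the test in [6], not the pyeas API): given a runs-by-cut-points matrix of convergence values, rank within each run and weight the column rank sums by their position.

```
import numpy as np
from scipy.stats import rankdata

def page_L(data):
    # Page's L statistic (illustrative sketch)
    # data: (n_runs, k_cut_points) matrix; ranks are assigned within each
    # run, and a larger L supports an increasing trend across cut points
    ranks = np.apply_along_axis(rankdata, 1, np.asarray(data, dtype=float))
    R = ranks.sum(axis=0)         # column rank sums
    j = np.arange(1, R.size + 1)  # cut-point positions
    return float(np.sum(j * R))
```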

## Additional Functionality

### Groupings

You might not always want a population member to be a 1D array with a corresponding boundary array, e.g.,
- member: `[-0.5, -1.9, 1.6, -2, 8]`
- boundaries: `[[-1,1], [-2,2], [-2,2], [-10,10], [-10,10]]`

Instead, we might want to group a member into sub-arrays corresponding to a list of boundaries. We can do this using the `groupings` argument (see the sketch below), e.g.,
- member: `[[-0.5], [-1.9, 1.6], [-2, 8]]`
- boundaries: `[[-1,1], [-2,2], [-10,10]]`
- grouping: `[1, 2, 2]`
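
A hedged sketch of passing grouped boundaries, assuming the same DE constructor as above:

```
optimizer = DE(mut=0.6,
               crossp=0.6,
               bounds=np.array([[-1,1], [-2,2], [-10,10]]),
               groupings=[1, 2, 2])
```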



## References 

[1] R. Storn and K. Price, “Differential Evolution – A Simple and Efficient Heuristic for global Optimization over Continuous Spaces,” Journal of Global Optimization, vol. 11, no. 4, pp. 341–359, Dec. 1997. [Online]. Available: https://doi.org/10.1023/A:1008202821328

[2] T. Salimans, J. Ho, X. Chen, S. Sidor, and I. Sutskever, “Evolution Strategies as a Scalable Alternative to Reinforcement Learning,” Sep. 2017. [Online]. Available: http://arxiv.org/abs/1703.03864

[3] D. P. Kingma and J. Ba, “Adam: A Method for Stochastic Optimization,” arXiv, Jan. 2017. [Online]. Available: http://arxiv.org/abs/1412.6980

[4] N. Hansen, "The CMA Evolution Strategy: A Tutorial," 2016. [Online]. Available: https://arxiv.org/abs/1604.00772

[5] B. A. H. Jones, N. Al Moubayed, D. A. Zeze, and C. Groves, "Enhanced methods for Evolution in-Materio Processors," in 2021 International Conference on Rebooting Computing (ICRC), Nov. 2021, pp. 109–118. [Online]. Available: http://doi.org/10.1109/ICRC53822.2021.00026

[6] J. Derrac, S. García, S. Hui, P. N. Suganthan, and F. Herrera, "Analyzing convergence performance of evolutionary algorithms: A statistical approach," Information Sciences, vol. 289, pp. 41–58, 2014. [Online]. Available: http://dx.doi.org/10.1016/j.ins.2014.06.009


            
