pdequinox

Name: pdequinox
Version: 0.1.2
Summary: Neural PDE Emulator Architectures in JAX built on top of Equinox.
Author: Felix Koehler
Requires-Python: ~=3.10
Keywords: jax, sciml, deep-learning, pde, neural operator
Upload time: 2024-10-17 07:02:31
<h4 align="center">PDE Emulator Architectures for <a href="https://github.com/patrick-kidger/equinox" target="_blank">Equinox</a>.</h4>

<p align="center">
<a href="https://pypi.org/project/pdequinox/">
  <img src="https://img.shields.io/pypi/v/pdequinox.svg" alt="PyPI">
</a>
<a href="https://github.com/ceyron/pdequinox/actions/workflows/test.yml">
  <img src="https://github.com/ceyron/pdequinox/actions/workflows/test.yml/badge.svg" alt="Tests">
</a>
<a href="https://fkoehler.site/pdequinox/">
  <img src="https://img.shields.io/badge/docs-latest-green" alt="docs-latest">
</a>
<a href="https://github.com/ceyron/pdequinox/releases">
  <img src="https://img.shields.io/github/v/release/ceyron/pdequinox?include_prereleases&label=changelog" alt="Changelog">
</a>
<a href="https://github.com/ceyron/pdequinox/blob/main/LICENSE.txt">
  <img src="https://img.shields.io/badge/license-MIT-blue" alt="License">
</a>
</p>

<p align="center">
  <a href="#installation">Installation</a> •
  <a href="#documentation">Documentation</a> •
  <a href="#quickstart">Quickstart</a> •
  <a href="#background">Background</a> •
  <a href="#features">Features</a> •
  <a href="#boundary-conditions">Boundary Conditions</a> •
  <!-- <a href="#constructors">Constructors</a> • -->
  <a href="#acknowledgements">Acknowledgements</a>
</p>

<p align="center">
    <img width=600 src="https://github.com/user-attachments/assets/8948f0e8-b879-468e-aaa2-158788b4d3f2">
</p>

A collection of neural architectures for emulating Partial Differential Equations (PDEs) in JAX agnostic to the spatial dimension (1D, 2D, 3D) and boundary conditions (Dirichlet, Neumann, Periodic). This package is built on top of [Equinox](https://github.com/patrick-kidger/equinox).

## Installation

```bash
pip install pdequinox
```

Requires Python 3.10+ and JAX 0.4.13+. 👉 [JAX install guide](https://jax.readthedocs.io/en/latest/installation.html).

## Documentation

The documentation is available at [fkoehler.site/pdequinox](https://fkoehler.site/pdequinox/).


## Quickstart

Train a UNet to become an emulator for the 1D Poisson equation.

```python
import jax
import jax.numpy as jnp
import equinox as eqx
import optax  # `pip install optax`
import pdequinox as pdeqx
from tqdm import tqdm  # `pip install tqdm`

force_fields, displacement_fields = pdeqx.sample_data.poisson_1d_dirichlet(
    key=jax.random.PRNGKey(0)
)

force_fields_train = force_fields[:800]
force_fields_test = force_fields[800:]
displacement_fields_train = displacement_fields[:800]
displacement_fields_test = displacement_fields[800:]

unet = pdeqx.arch.ClassicUNet(1, 1, 1, key=jax.random.PRNGKey(1))  # 1 spatial dim, 1 input channel, 1 output channel

def loss_fn(model, x, y):
    y_pred = jax.vmap(model)(x)  # the model acts on single samples; vmap batches it
    return jnp.mean((y_pred - y) ** 2)

opt = optax.adam(3e-4)
opt_state = opt.init(eqx.filter(unet, eqx.is_array))

@eqx.filter_jit
def update_fn(model, state, x, y):
    loss, grad = eqx.filter_value_and_grad(loss_fn)(model, x, y)
    updates, new_state = opt.update(grad, state, model)
    new_model = eqx.apply_updates(model, updates)
    return new_model, new_state, loss

loss_history = []
shuffle_key = jax.random.PRNGKey(151)
for epoch in tqdm(range(100)):
    shuffle_key, subkey = jax.random.split(shuffle_key)

    for batch in pdeqx.dataloader(
        (force_fields_train, displacement_fields_train),
        batch_size=32,
        key=subkey
    ):
        unet, opt_state, loss = update_fn(
            unet,
            opt_state,
            *batch,
        )
        loss_history.append(loss)
```
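
After training, the held-out split defined above can be used to sanity-check generalization. A minimal evaluation sketch, reusing `loss_fn` and the test arrays from the quickstart:

```python
test_loss = loss_fn(unet, force_fields_test, displacement_fields_test)
print(f"test MSE: {float(test_loss):.2e}")

# Inspect full predictions (the model maps one sample, so vmap over the batch)
predictions = jax.vmap(unet)(force_fields_test)
```
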
## Background

Neural emulators are networks trained to efficiently predict physical
phenomena, often governed by PDEs. In the simplest case this can be a linear
advection equation, all the way up to complicated Navier-Stokes flows. If we
work on uniform Cartesian grids* (which this package assumes), one can borrow
plenty of architectures from image-to-image tasks in computer vision (e.g.,
segmentation). These include:

* Standard Feedforward ConvNets
* Convolutional ResNets ([He et al.](https://arxiv.org/abs/1512.03385))
* U-Nets ([Ronneberger et al.](https://arxiv.org/abs/1505.04597))
* Dilated ResNets ([Yu et al.](https://arxiv.org/abs/1511.07122), [Stachenfeld et al.](https://arxiv.org/abs/2112.15275))
* Fourier Neural Operators ([Li et al.](https://arxiv.org/abs/2010.08895))

It is interesting to note that most of these architectures resemble classical
numerical methods or at least share similarities with them. For example,
ConvNets (or convolutions in general) are related to finite differences, while
U-Nets resemble multigrid methods. Fourier Neural Operators are related to
spectral methods. The difference is that the emulators' free parameters are
found by (data-driven) numerical optimization rather than by symbolic
manipulation of the differential equations.
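
To make the finite-difference analogy concrete: a 1D convolution with the fixed stencil $[1, -2, 1]/\Delta x^2$ is exactly the classical second-derivative approximation, and a ConvNet simply learns such stencils from data instead of deriving them. A self-contained sketch in plain JAX (illustrative only, not part of the `pdequinox` API):

```python
import jax.numpy as jnp

dx = 2 * jnp.pi / 100
x = jnp.arange(100) * dx
u = jnp.sin(x)

# Classical central finite-difference stencil for the second derivative
stencil = jnp.array([1.0, -2.0, 1.0]) / dx**2

# Applied as a convolution, it approximates u_xx = -sin(x) at interior points
u_xx = jnp.convolve(u, stencil, mode="same")
print(jnp.max(jnp.abs(u_xx[1:-1] + jnp.sin(x)[1:-1])))  # small discretization error
```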

(*) This means that space is discretized on a pixel or voxel grid; hence, the
domain is restricted to a scaled unit cube $\Omega = (0, L)^D$.

## Features

* Based on [JAX](https://github.com/google/jax):
  * One of the best Automatic Differentiation engines (forward & reverse)
  * Automatic vectorization
  * Backend-agnostic code (run on CPU, GPU, and TPU)
* Based on [Equinox](https://github.com/patrick-kidger/equinox):
  * Single-batch by design; vectorize over batches with `jax.vmap`
  * Integration into the Equinox SciML ecosystem
* Agnostic to the spatial dimension (works for 1D, 2D, and 3D)
* Agnostic to the boundary condition (works for Dirichlet, Neumann, and periodic
  BCs)
* Composability
* Tools to count parameters and assess receptive fields (see the sketch below)
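
For example, a parameter count can be obtained with plain Equinox/JAX primitives. This is a minimal sketch of the idea, not the package's built-in tooling (see the documentation for `pdequinox`'s own utilities):

```python
import jax
import equinox as eqx
import pdequinox as pdeqx

def count_parameters(model) -> int:
    """Sum the sizes of all trainable array leaves of an Equinox module."""
    params = eqx.filter(model, eqx.is_array)
    return sum(leaf.size for leaf in jax.tree_util.tree_leaves(params))

unet = pdeqx.arch.ClassicUNet(1, 1, 1, key=jax.random.PRNGKey(0))
print(count_parameters(unet))
```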

## Boundary Conditions

This package assumes that the boundary condition is baked into the neural
emulator. Hence, most components allow setting `boundary_mode` which can be
`"dirichlet"`, `"neumann"`, or `"periodic"`. This affects what is considered a
degree of freedom in the grid.

![three_boundary_conditions](https://github.com/user-attachments/assets/a46c276c-4c4b-4890-aca2-49c8b04d1948)

Dirichlet boundaries fully eliminate the degrees of freedom on the boundary.
Periodic boundaries keep only one end of the domain as a degree of freedom
(this package follows the convention that the left boundary carries the degree
of freedom). Neumann boundaries keep both ends as degrees of freedom.
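
In practice, the boundary mode is chosen at construction time. A sketch, assuming the architecture constructors forward `boundary_mode` as described above (see the documentation for exact signatures):

```python
import jax
import pdequinox as pdeqx

# Emulator for a 1D problem with Dirichlet boundaries; the internal
# convolutions pad consistently with this choice of boundary condition.
net = pdeqx.arch.ClassicUNet(1, 1, 1, boundary_mode="dirichlet", key=jax.random.PRNGKey(0))
```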

<!-- ## Constructors

There are two primary architectural constructors for Sequential and Hierarchical
Networks that allow for composability with the `PDEquinox` blocks.

### Sequential Constructor

![sequential_net](https://github.com/user-attachments/assets/866f9cb9-5d6f-462e-8621-26b74526ae68)

The sequential network constructor is defined by:
* a lifting block $\mathcal{L}$
* $N$ blocks $\left \{ \mathcal{B}_i \right\}_{i=1}^N$
* a projection block $\mathcal{P}$
* the hidden channels within the sequential processing
* the number of blocks $N$ (one can also supply a list of hidden channels if they shall be different between blocks)

### Hierarchical Constructor

![hierarchical_net](https://github.com/user-attachments/assets/b574c834-b8c8-476d-aabb-c121ba41d5c3)

The hierarchical network constructor is defined by:
* a lifting block $\mathcal{L}$
* The number of levels $D$ (i.e., the number of additional hierarchies). Setting $D = 0$ recovers the sequential processing.
* a list of $D$ blocks $\left \{ \mathcal{D}_i \right\}_{i=1}^D$ for
  downsampling, i.e., mapping downwards to the lower hierarchy (oftentimes
  these halve the spatial axes while keeping the number of channels)
* a list of $D$ blocks $\left \{ \mathcal{B}_i^l \right\}_{i=1}^D$ for
  processing in the left arc (oftentimes this changes the number of channels,
  e.g. doubles it such that the combination of downsampling and left processing
  halves the spatial resolution and doubles the feature count)
* a list of $D$ blocks $\left \{ \mathcal{U}_i \right\}_{i=1}^D$ for upsampling,
  i.e., mapping upwards to the higher hierarchy (oftentimes this doubles the
  spatial resolution; at the same time it halves the feature count such that we
  can concatenate a skip connection)
* a list of $D$ blocks $\left \{ \mathcal{B}_i^r \right\}_{i=1}^D$ for
  processing in the right arc (oftentimes this changes the number of channels,
  e.g. halves it such that the combination of upsampling and right processing
  doubles the spatial resolution and halves the feature count)
* a projection block $\mathcal{P}$
* the hidden channels within the hierarchical processing (if just an integer
  is provided, it is taken as the number of hidden channels at the highest
  hierarchy level)

### Beyond Architectural Constructors

For completeness, `pdequinox.arch` also provides a `ConvNet`, which is a simple
feed-forward convolutional network, and an `MLP`, a dense network that requires
pre-defining the number of resolution points. -->

## Acknowledgements

### Related Work

Similar packages that provide a collection of emulator architectures are
[PDEBench](https://github.com/pdebench/PDEBench) and
[PDEArena](https://github.com/pdearena/pdearena). With a focus on Physics-informed
Neural Networks and Neural Operators, there are also
[DeepXDE](https://github.com/lululxvi/deepxde) and [NVIDIA
Modulus](https://developer.nvidia.com/modulus).

### Citation

This package was developed as part of the APEBench paper (accepted at NeurIPS 2024); the citation will be added here soon.

### Funding

The main author (Felix Koehler) is a PhD student in the group of [Prof. Thuerey at TUM](https://ge.in.tum.de/) and his research is funded by the [Munich Center for Machine Learning](https://mcml.ai/).

### License

MIT, see [here](LICENSE.txt)

---

> [fkoehler.site](https://fkoehler.site/) &nbsp;&middot;&nbsp;
> GitHub [@ceyron](https://github.com/ceyron) &nbsp;&middot;&nbsp;
> X [@felix_m_koehler](https://twitter.com/felix_m_koehler) &nbsp;&middot;&nbsp;
> LinkedIn [Felix Köhler](https://www.linkedin.com/in/felix-koehler)

            
