mono-dense-keras
================

- Name: mono-dense-keras
- Version: 0.1.0
- Home page: https://github.com/airtai/mono-dense-keras
- Summary: Monotonic Dense Layer implemented in Keras
- Upload time: 2023-06-02 14:21:20
- Author: AIRT Technologies d.o.o.
- Requires Python: >=3.8
- License: Creative Commons License
- Keywords: tensorflow, monotone, monotonic, dense, layer, nbdev, nbdev-mkdocs, jupyter, notebook, python
            Monotonic Dense Layer
================

<!-- WARNING: THIS FILE WAS AUTOGENERATED! DO NOT EDIT! -->

This Python library implements Monotonic Dense Layer as described in
Davor Runje, Sharath M. Shankaranarayana, “Constrained Monotonic Neural
Networks” \[[PDF](https://arxiv.org/pdf/2205.11775.pdf)\].

If you use this library, please cite:

``` title="bibtex"
@inproceedings{runje2023,
  title={Constrained Monotonic Neural Networks},
  author={Davor Runje and Sharath M. Shankaranarayana},
  booktitle={Proceedings of the 40th {International Conference on Machine Learning}},
  year={2023}
}
```

This package contains an implementation of our Monotonic Dense Layer
[`MonoDense`](https://mono-dense-keras.airt.ai/0.1.0/api/mono_dense_keras/MonoDense/#mono_dense_keras.MonoDense)
(Constrained Monotonic Fully Connected Layer). Below is the figure from
the paper for reference.

In the code, the variable `monotonicity_indicator` corresponds to **t**
in the figure and parameters `is_convex`, `is_concave` and
`activation_weights` are used to calculate the activation selector **s**
as follows:

- if `is_convex` or `is_concave` is **True**, then the activation
  selector **s** will be (`units`, 0, 0) and (0, `units`, 0),
  respectively.

- if both `is_convex` and `is_concave` are **False**, then the
  `activation_weights` represent the ratios between $\breve{s}$, $\hat{s}$
  and $\tilde{s}$, respectively. E.g. if `activation_weights = (2, 2, 1)`
  and `units = 10`, then

$$
(\breve{s}, \hat{s}, \tilde{s}) = (4, 4, 2)
$$
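The split above can be sketched as a small helper; the rounding policy shown here is an illustration for this example, not necessarily the library's exact behavior:

```python
def activation_split(units, activation_weights):
    """Turn activation-weight ratios into integer unit counts for the
    convex, concave and saturated activation groups."""
    total = sum(activation_weights)
    counts = [round(units * w / total) for w in activation_weights]
    # assign any rounding remainder to the last group (assumed policy)
    counts[-1] += units - sum(counts)
    return tuple(counts)

print(activation_split(10, (2, 2, 1)))  # (4, 4, 2)
```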

![mono-dense-layer-diagram](https://github.com/airtai/mono-dense-keras/raw/main/nbs/images/mono-dense-layer-diagram.png)

## Install

``` sh
pip install mono-dense-keras
```

## How to use

In this example, we’ll assume we have a simple dataset with three input
values $x_1$, $x_2$ and $x_3$ sampled from the normal distribution,
while the output value $y$ is calculated according to the following
formula, with Gaussian noise added afterwards:

$y = x_1^3 + \sin\left(\frac{x_2}{2 \pi}\right) + e^{-x_3}$
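Such a dataset can be generated with NumPy; the seed, noise scale and dataset sizes below are arbitrary choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(42)  # seed chosen arbitrarily

def generate_data(n):
    # three inputs sampled from the standard normal distribution
    x = rng.normal(size=(n, 3))
    # y = x1^3 + sin(x2 / (2*pi)) + exp(-x3), plus Gaussian noise
    y = x[:, 0] ** 3 + np.sin(x[:, 1] / (2 * np.pi)) + np.exp(-x[:, 2])
    y += rng.normal(scale=0.1, size=n)  # noise scale is an assumption
    return x, y

x_train, y_train = generate_data(10_000)
x_val, y_val = generate_data(1_000)
```

The first few rows of such a dataset look like this: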

<table id="T_6be9e">
  <thead>
    <tr>
      <th id="T_6be9e_level0_col0" class="col_heading level0 col0" >x1</th>
      <th id="T_6be9e_level0_col1" class="col_heading level0 col1" >x2</th>
      <th id="T_6be9e_level0_col2" class="col_heading level0 col2" >x3</th>
      <th id="T_6be9e_level0_col3" class="col_heading level0 col3" >y</th>
    </tr>
  </thead>
  <tbody>
    <tr>
      <td id="T_6be9e_row0_col0" class="data row0 col0" >0.304717</td>
      <td id="T_6be9e_row0_col1" class="data row0 col1" >-1.039984</td>
      <td id="T_6be9e_row0_col2" class="data row0 col2" >0.750451</td>
      <td id="T_6be9e_row0_col3" class="data row0 col3" >0.234541</td>
    </tr>
    <tr>
      <td id="T_6be9e_row1_col0" class="data row1 col0" >0.940565</td>
      <td id="T_6be9e_row1_col1" class="data row1 col1" >-1.951035</td>
      <td id="T_6be9e_row1_col2" class="data row1 col2" >-1.302180</td>
      <td id="T_6be9e_row1_col3" class="data row1 col3" >4.199094</td>
    </tr>
    <tr>
      <td id="T_6be9e_row2_col0" class="data row2 col0" >0.127840</td>
      <td id="T_6be9e_row2_col1" class="data row2 col1" >-0.316243</td>
      <td id="T_6be9e_row2_col2" class="data row2 col2" >-0.016801</td>
      <td id="T_6be9e_row2_col3" class="data row2 col3" >0.834086</td>
    </tr>
    <tr>
      <td id="T_6be9e_row3_col0" class="data row3 col0" >-0.853044</td>
      <td id="T_6be9e_row3_col1" class="data row3 col1" >0.879398</td>
      <td id="T_6be9e_row3_col2" class="data row3 col2" >0.777792</td>
      <td id="T_6be9e_row3_col3" class="data row3 col3" >-0.093359</td>
    </tr>
    <tr>
      <td id="T_6be9e_row4_col0" class="data row4 col0" >0.066031</td>
      <td id="T_6be9e_row4_col1" class="data row4 col1" >1.127241</td>
      <td id="T_6be9e_row4_col2" class="data row4 col2" >0.467509</td>
      <td id="T_6be9e_row4_col3" class="data row4 col3" >0.780875</td>
    </tr>
  </tbody>
</table>

Now, we’ll use the
[`MonoDense`](https://mono-dense-keras.airt.ai/0.1.0/api/mono_dense_keras/MonoDense/#mono_dense_keras.MonoDense)
layer instead of the `Dense` layer to build a simple monotonic network. By
default, the
[`MonoDense`](https://mono-dense-keras.airt.ai/0.1.0/api/mono_dense_keras/MonoDense/#mono_dense_keras.MonoDense)
layer assumes its output is monotonically increasing with
all inputs. This assumption always holds for all layers except possibly
the first one. For the first layer, we use `monotonicity_indicator` to
specify which input parameters are monotonic and whether they are
monotonically increasing or decreasing:

- set 1 for increasingly monotonic parameter,

- set -1 for decreasingly monotonic parameter, and

- set 0 otherwise.

In our case, the `monotonicity_indicator` is `[1, 0, -1]` because $y$ is:

- monotonically increasing w.r.t. $x_1$
  $\left(\frac{\partial y}{\partial x_1} = 3 {x_1}^2 \geq 0\right)$, and

- monotonically decreasing w.r.t. $x_3$
  $\left(\frac{\partial y}{\partial x_3} = - e^{-x_3} \leq 0\right)$.

``` python
from tensorflow.keras import Sequential
from tensorflow.keras.layers import Input

from mono_dense_keras import MonoDense

model = Sequential()

model.add(Input(shape=(3,)))
monotonicity_indicator = [1, 0, -1]
model.add(
    MonoDense(128, activation="elu", monotonicity_indicator=monotonicity_indicator)
)
model.add(MonoDense(128, activation="elu"))
model.add(MonoDense(1))

model.summary()
```

    Model: "sequential_7"
    _________________________________________________________________
     Layer (type)                Output Shape              Param #   
    =================================================================
     mono_dense_21 (MonoDense)   (None, 128)               512       
                                                                     
     mono_dense_22 (MonoDense)   (None, 128)               16512     
                                                                     
     mono_dense_23 (MonoDense)   (None, 1)                 129       
                                                                     
    =================================================================
    Total params: 17,153
    Trainable params: 17,153
    Non-trainable params: 0
    _________________________________________________________________
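The parameter counts match the usual dense-layer formula, inputs × units for the weights plus units for the biases:

```python
def dense_params(n_in, units):
    # weight matrix plus bias vector, same as a plain Dense layer
    return n_in * units + units

print(dense_params(3, 128))    # 512
print(dense_params(128, 128))  # 16512
print(dense_params(128, 1))    # 129
```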

Now we can train the model as usual using `Model.fit`:

``` python
from tensorflow.keras.optimizers import Adam
from tensorflow.keras.optimizers.schedules import ExponentialDecay

lr_schedule = ExponentialDecay(
    initial_learning_rate=0.01,
    decay_steps=10_000 // 32,
    decay_rate=0.9,
)
optimizer = Adam(learning_rate=lr_schedule)
model.compile(optimizer=optimizer, loss="mse")

model.fit(
    x=x_train, y=y_train, batch_size=32, validation_data=(x_val, y_val), epochs=10
)
```
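Note that `decay_steps = 10_000 // 32` is roughly one epoch at batch size 32 (the 313 steps per epoch seen in the log), so the learning rate shrinks by about 10% per epoch. The schedule's continuous (non-staircase) form can be sketched as:

```python
initial_lr, decay_rate = 0.01, 0.9
decay_steps = 10_000 // 32  # roughly one epoch at batch size 32

def lr_at(step):
    # ExponentialDecay with staircase=False (the default)
    return initial_lr * decay_rate ** (step / decay_steps)

print(round(lr_at(0), 5))                 # 0.01
print(round(lr_at(10 * decay_steps), 5))  # 0.00349, after ~10 epochs
```

The `model.fit` call above produces the training log that follows.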

    Epoch 1/10
    313/313 [==============================] - 2s 5ms/step - loss: 9.6909 - val_loss: 6.3050
    Epoch 2/10
    313/313 [==============================] - 1s 4ms/step - loss: 4.1970 - val_loss: 2.0028
    Epoch 3/10
    313/313 [==============================] - 1s 4ms/step - loss: 1.7086 - val_loss: 1.0551
    Epoch 4/10
    313/313 [==============================] - 1s 4ms/step - loss: 0.9906 - val_loss: 0.5927
    Epoch 5/10
    313/313 [==============================] - 1s 4ms/step - loss: 0.6411 - val_loss: 0.1694
    Epoch 6/10
    313/313 [==============================] - 1s 4ms/step - loss: 0.6686 - val_loss: 1.7604
    Epoch 7/10
    313/313 [==============================] - 1s 4ms/step - loss: 0.6464 - val_loss: 0.1079
    Epoch 8/10
    313/313 [==============================] - 1s 4ms/step - loss: 0.4570 - val_loss: 0.1365
    Epoch 9/10
    313/313 [==============================] - 1s 4ms/step - loss: 0.2945 - val_loss: 0.0664
    Epoch 10/10
    313/313 [==============================] - 1s 4ms/step - loss: 0.2095 - val_loss: 0.0849

    <keras.callbacks.History>
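The architecture guarantees monotonicity by construction, but a finite-difference probe is a handy sanity check on any trained model. `predict_fn` below is a stand-in for something like `lambda x: model.predict(x).ravel()`; the `toy` model is a hypothetical example:

```python
import numpy as np

def check_monotonic(predict_fn, x, feature, direction, eps=1e-2):
    # Perturb one feature and verify that predictions move in the
    # expected direction (+1 increasing, -1 decreasing) for all samples.
    # A probe, not a proof: it only tests the sampled points.
    x_hi = x.copy()
    x_hi[:, feature] += eps
    diff = predict_fn(x_hi) - predict_fn(x)
    return bool(np.all(direction * diff >= 0))

def toy(x):
    # stand-in model: increasing in the first input, decreasing in the third
    return x[:, 0] - x[:, 2]

x = np.random.default_rng(0).normal(size=(100, 3))
print(check_monotonic(toy, x, feature=0, direction=+1))  # True
print(check_monotonic(toy, x, feature=2, direction=-1))  # True
```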

## License

<a rel="license" href="http://creativecommons.org/licenses/by-nc-sa/4.0/"><img alt="Creative Commons Licence" style="border-width:0" src="https://i.creativecommons.org/l/by-nc-sa/4.0/88x31.png" /></a><br />This
work is licensed under a
<a rel="license" href="http://creativecommons.org/licenses/by-nc-sa/4.0/">Creative
Commons Attribution-NonCommercial-ShareAlike 4.0 International
License</a>.

You are free to:

- Share — copy and redistribute the material in any medium or format

- Adapt — remix, transform, and build upon the material

The licensor cannot revoke these freedoms as long as you follow the
license terms.

Under the following terms:

- Attribution — You must give appropriate credit, provide a link to the
  license, and indicate if changes were made. You may do so in any
  reasonable manner, but not in any way that suggests the licensor
  endorses you or your use.

- NonCommercial — You may not use the material for commercial purposes.

- ShareAlike — If you remix, transform, or build upon the material, you
  must distribute your contributions under the same license as the
  original.

- No additional restrictions — You may not apply legal terms or
  technological measures that legally restrict others from doing
  anything the license permits.

            
