| Field | Value |
| --- | --- |
| Name | dks |
| Version | 0.1.2 |
| home_page | https://github.com/deepmind/dks |
| Summary | A Python library implementing the DKS/TAT neural network transformation method. |
| upload_time | 2023-01-30 23:19:15 |
| maintainer | |
| docs_url | None |
| author | DeepMind |
| requires_python | |
| license | Apache 2.0 |
| keywords | |
| VCS | |
| bugtrack_url | |
| requirements | No requirements were recorded. |
| Travis-CI | No Travis. |
| coveralls test coverage | No coveralls. |

![CI status](https://github.com/deepmind/dks/workflows/ci/badge.svg)
![pypi](https://img.shields.io/pypi/v/dks)
# Official Python package for Deep Kernel Shaping (DKS) and Tailored Activation Transformations (TAT)
This Python package implements the activation function transformations, weight
initializations, and dataset preprocessing used in Deep Kernel Shaping (DKS) and
Tailored Activation Transformations (TAT). DKS and TAT, which were introduced in
the [DKS paper] and [TAT paper], are methods for constructing/transforming
neural networks to make them much easier to train. For example, these methods
can be used in conjunction with K-FAC to train vanilla deep convnets
(without skip connections or normalization layers) as fast as standard ResNets
of the same depth.
The package supports the JAX, PyTorch, and TensorFlow tensor programming
frameworks.
Questions/comments about the code can be sent to
[dks-dev@google.com](mailto:dks-dev@google.com).
**NOTE:** We are not accepting code contributions via GitHub at this time; all
PRs from GitHub will be rejected. Instead, please email us if you find a bug.
## Usage
For each of the supported tensor programming frameworks, there is a
corresponding subpackage which handles the activation function transformations,
weight initializations, and (optional) data preprocessing. (These are `dks.jax`,
`dks.pytorch`, and `dks.tensorflow`.) It's up to the user to import these and
use them appropriately within their model code. Activation functions are
transformed by the function `get_transformed_activations()` in the module
`activation_transform` of the appropriate subpackage. Sampling initial
parameters is done using functions in the module
`parameter_sampling_functions` of said subpackage. And data preprocessing is
done using the function `per_location_normalization` in the module
`data_preprocessing` of said subpackage. Note that to avoid having to import
all of the tensor programming frameworks, users must individually import
whichever framework subpackage they want, e.g. `import dks.jax`. Meanwhile, a
bare `import dks` won't actually do anything.
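As a quick orientation, here is a minimal sketch of that import pattern. It
assumes the framework subpackage re-exports its submodules; if it does not,
import each module directly, e.g. `from dks.jax import activation_transform`.

```python
# Import the framework-specific subpackage explicitly; a bare `import dks`
# exposes none of the framework modules.
import dks.jax as dks_jax  # or: import dks.pytorch / import dks.tensorflow

# The three entry points described above live in these modules:
act_transform_mod = dks_jax.activation_transform          # get_transformed_activations()
param_sampling_mod = dks_jax.parameter_sampling_functions  # weight-initialization helpers
preprocessing_mod = dks_jax.data_preprocessing             # per_location_normalization()
```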
`get_transformed_activations()` requires the user to pass either the "maximal
slope function" for DKS, the "subnet maximizing function" for TAT with Leaky
ReLUs, or the "maximal curvature function" for TAT with smooth activation
functions. (The subnet maximizing function also handles DKS and TAT with smooth
activations.) These are special functions that encode information about the
particular model architecture. See the section titled "Summary of our method" of
the [DKS paper] for a procedure to construct the maximal slope function for a
given model, or the appendix section titled "Additional details and pseudocode
for activation function transformations" of the [TAT paper] for procedures to
construct the other two functions.
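As an illustration, the sketch below builds a maximal slope function for a
plain chain of `depth` nonlinear layers (where the deepest subnetwork simply
composes the per-layer slope `depth` times) and passes it to
`get_transformed_activations()`. The keyword names (`method`,
`max_slope_func`), the `"softplus"` activation name, and the dict return value
are assumptions based on the description above, not confirmed API; check the
function's docstring before use.

```python
import dks.jax

depth = 50  # number of nonlinear layers in the (hypothetical) plain chain

def max_slope_func(local_slope):
    # For a simple chain, the maximal subnetwork slope is the per-layer
    # slope multiplied through once per nonlinear layer.
    return local_slope ** depth

# Hypothetical call; keyword names are assumptions (see lead-in above).
transformed = dks.jax.activation_transform.get_transformed_activations(
    ["softplus"], method="DKS", max_slope_func=max_slope_func)
softplus_dks = transformed["softplus"]  # use in the model instead of jax.nn.softplus
```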
In addition to these things, the user is responsible for ensuring that their
model meets the architectural requirements of DKS/TAT, and for converting any
weighted sums into "normalized sums" (which are weighted sums whose
non-trainable weights have a sum of squares equal to 1). See the section titled
"Summary of our method" of the [DKS paper] for more details.
Note that the data preprocessing method implemented, called Per-Location
Normalization (PLN), may not always be needed in practice, but we have observed
certain situations where not using it can lead to problems. (For example, training
on datasets that contain all-zero pixels, such as CIFAR-10.) Also
note that ReLUs are only partially supported by DKS, and unsupported by TAT, and
so their use is *highly* discouraged. Instead, one should use Leaky ReLUs, which
are fully supported by DKS, and work especially well with TAT.
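Below is a rough sketch of applying PLN to an input batch. We are assuming
here that `per_location_normalization()` takes the raw input array and returns
a preprocessed array of the same shape; the exact signature and any optional
arguments should be checked against the function's docstring.

```python
import jax.numpy as jnp
import dks.jax

# A dummy CIFAR-10-sized batch (NHWC); real code would load actual data.
images = jnp.zeros([128, 32, 32, 3])

# Assumed usage (see lead-in): preprocess the inputs before feeding the model.
preprocessed = dks.jax.data_preprocessing.per_location_normalization(images)
```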
## Example
`dks.examples.haiku.modified_resnet` is a [Haiku] ResNet model which has been
modified as described in the DKS/TAT papers, and includes support for both DKS
and TAT. When constructed with its default arguments, it removes the
normalization layers and skip connections found in standard ResNets, making it a
"vanilla network". It can be used as an instructive example for how to build
DKS/TAT models using this package. See the section titled "Application to
various modified ResNets" from the [DKS paper] for more details.
## Installation
This package can be installed directly from GitHub using `pip` with
```bash
pip install git+https://github.com/deepmind/dks.git
```
or
```bash
pip install -e git+https://github.com/deepmind/dks.git#egg=dks[<extras>]
```
Or from PyPI with
```bash
pip install dks
```
or
```bash
pip install dks[<extras>]
```
Here `<extras>` is a comma-separated list of strings (with no spaces) that can
be passed to install extra dependencies for different tensor programming
frameworks. Valid strings are `jax`, `tf`, and `pytorch`. So for example, to
install `dks` with the extra requirements for JAX and PyTorch, one does
```bash
pip install dks[jax,pytorch]
```
## Testing
To run tests in a Python virtual environment with specific pinned versions of
all the dependencies one can do:
```bash
git clone https://github.com/deepmind/dks.git
cd dks
./test.sh
```
However, it is strongly recommended that you run the tests in the same Python
environment (with the same package versions) as you plan to actually use `dks`.
This can be accomplished by installing `dks` for all three tensor programming
frameworks (e.g. with `pip install dks[jax,pytorch,tf]` or some other
installation method), and then doing
```bash
pip install pytest-xdist
git clone https://github.com/deepmind/dks.git
cd dks
python -m pytest -n 16 tests
```
## Disclaimer
This is not an official Google product.
[DKS paper]: https://arxiv.org/abs/2110.01765
[TAT paper]: https://openreview.net/forum?id=U0k7XNTiFEq
[Haiku]: https://github.com/deepmind/dm-haiku
Raw data

```json
{
  "_id": null,
  "home_page": "https://github.com/deepmind/dks",
  "name": "dks",
  "maintainer": "",
  "docs_url": null,
  "requires_python": "",
  "maintainer_email": "",
  "keywords": "",
  "author": "DeepMind",
  "author_email": "dks-dev@google.com",
  "download_url": "https://files.pythonhosted.org/packages/e2/ad/3e3958f6551ce272f652e86191293f9fdbcda0eb2d84bec7545541ac2309/dks-0.1.2.tar.gz",
  "platform": null,
"description": "![CI status](https://github.com/deepmind/dks/workflows/ci/badge.svg)\n![pypi](https://img.shields.io/pypi/v/dks)\n\n# Official Python package for Deep Kernel Shaping (DKS) and Tailored Activation Transformations (TAT)\n\nThis Python package implements the activation function transformations, weight\ninitializations, and dataset preprocessing used in Deep Kernel Shaping (DKS) and\nTailored Activation Transformations (TAT). DKS and TAT, which were introduced in\nthe [DKS paper] and [TAT paper], are methods for constructing/transforming\nneural networks to make them much easier to train. For example, these methods\ncan be used in conjunction with K-FAC to train deep vanilla deep convnets\n(without skip connections or normalization layers) as fast as standard ResNets\nof the same depth.\n\nThe package supports the JAX, PyTorch, and TensorFlow tensor programming\nframeworks.\n\nQuestions/comments about the code can be sent to\n[dks-dev@google.com](mailto:dks-dev@google.com).\n\n**NOTE:** we are not taking code contributions from Github at this time. All PRs\nfrom Github will be rejected. Instead, please email us if you find a bug.\n\n## Usage\n\nFor each of the supported tensor programming frameworks, there is a\ncorresponding subpackage which handles the activation function transformations,\nweight initializations, and (optional) data preprocessing. (These are `dks.jax`,\n`dks.pytorch`, and `dks.tensorflow`.) It's up to the user to import these and\nuse them appropriately within their model code. Activation functions are\ntransformed by the function `get_transformed_activations()` in the module\n`activation_transform` of the appropriate subpackage. Sampling initial\nparameters is done using functions inside of the module\n`parameter_sampling_functions` of said subpackage. And data preprocessing is\ndone using the function `per_location_normalization` inside of the module\n`data_preprocessing` of said subpackage. Note that in order to avoid having to\nimport all of the tensor programming frameworks, the user is required to\nindividually import whatever framework subpackage they want. e.g. `import\ndks.jax`. Meanwhile, `import dks` won't actually do anything.\n\n`get_transformed_activations()` requires the user to pass either the \"maximal\nslope function\" for DKS, the \"subnet maximizing function\" for TAT with Leaky\nReLUs, or the \"maximal curvature function\" for TAT with smooth activation\nfunctions. (The subnet maximizing function also handles DKS and TAT with smooth\nactivations.) These are special functions that encode information about the\nparticular model architecture. See the section titled \"Summary of our method\" of\nthe [DKS paper] for a procedure to construct the maximal slope function for a\ngiven model, or the appendix section titled \"Additional details and pseudocode\nfor activation function transformations\" of the [TAT paper] for procedures to\nconstruct the other two functions.\n\nIn addition to these things, the user is responsible for ensuring that their\nmodel meets the architectural requirements of DKS/TAT, and for converting any\nweighted sums into \"normalized sums\" (which are weighted sums whose\nnon-trainable weights have a sum of squares equal to 1). 
See the section titled\n\"Summary of our method\" of the [DKS paper] for more details.\n\nNote that the data preprocessing method implemented, called Per-Location \nNormalization (PLN), may not always be needed in practice, but we have observed\ncertain situations where not using can lead to problems. (For example, training\non datasets that contain all-zero pixels, such as CIFAR-10.) Also\nnote that ReLUs are only partially supported by DKS, and unsupported by TAT, and\nso their use is *highly* discouraged. Instead, one should use Leaky ReLUs, which\nare fully supported by DKS, and work especially well with TAT.\n\n## Example\n\n`dks.examples.haiku.modified_resnet` is a [Haiku] ResNet model which has been\nmodified as described in the DKS/TAT papers, and includes support for both DKS\nand TAT. When constructed with its default arguments, it removes the\nnormalization layers and skip connections found in standard ResNets, making it a\n\"vanilla network\". It can be used as an instructive example for how to build\nDKS/TAT models using this package. See the section titled \"Application to\nvarious modified ResNets\" from the [DKS paper] for more details.\n\n## Installation\n\nThis package can be installed directly from GitHub using `pip` with\n\n```bash\npip install git+https://github.com/deepmind/dks.git\n```\n\nor\n\n```bash\npip install -e git+https://github.com/deepmind/dks.git#egg=dks[<extras>]\n```\n\nOr from PyPI with\n\n```bash\npip install dks\n```\n\nor\n\n```bash\npip install dks[<extras>]\n```\n\nHere `<extras>` is a common-separated list of strings (with no spaces) that can\nbe passed to install extra dependencies for different tensor programming\nframeworks. Valid strings are `jax`, `tf`, and `pytorch`. So for example, to\ninstall `dks` with the extra requirements for JAX and PyTorch, one does\n\n```bash\npip install dks[jax,pytorch]\n```\n\n## Testing\n\nTo run tests in a Python virtual environment with specific pinned versions of\nall the dependencies one can do:\n\n```bash\ngit clone https://github.com/deepmind/dks.git\ncd dks\n./test.sh\n```\n\nHowever, it is strongly recommended that you run the tests in the same Python\nenvironment (with the same package versions) as you plan to actually use `dks`.\nThis can be accomplished by installing `dks` for all three tensors programming\nframeworks (e.g. with `pip install dks[jax,pytorch,tf]` or some other\ninstallation method), and then doing\n\n```bash\npip install pytest-xdist\ngit clone https://github.com/deepmind/dks.git\ncd dks\npython -m pytest -n 16 tests\n```\n\n## Disclaimer\n\nThis is not an official Google product.\n\n[DKS paper]: https://arxiv.org/abs/2110.01765\n[TAT paper]: https://openreview.net/forum?id=U0k7XNTiFEq\n[Haiku]: https://github.com/deepmind/dm-haiku\n",
"bugtrack_url": null,
"license": "Apache 2.0",
"summary": "A Python library implementing the DKS/TAT neural network transformation method.",
"version": "0.1.2",
"split_keywords": [],
"urls": [
{
"comment_text": "",
"digests": {
"blake2b_256": "a43a210ee90cece1f233d16795c14c513b8964750865a9040e769253d5573883",
"md5": "e4af540f63f07837beda5b453a6209bc",
"sha256": "14e7ccf6300371f069f783647f6228e066e8f21124003334b395f17c23be668f"
},
"downloads": -1,
"filename": "dks-0.1.2-py3-none-any.whl",
"has_sig": false,
"md5_digest": "e4af540f63f07837beda5b453a6209bc",
"packagetype": "bdist_wheel",
"python_version": "py3",
"requires_python": null,
"size": 1296304,
"upload_time": "2023-01-30T23:19:13",
"upload_time_iso_8601": "2023-01-30T23:19:13.583369Z",
"url": "https://files.pythonhosted.org/packages/a4/3a/210ee90cece1f233d16795c14c513b8964750865a9040e769253d5573883/dks-0.1.2-py3-none-any.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": "",
"digests": {
"blake2b_256": "e2ad3e3958f6551ce272f652e86191293f9fdbcda0eb2d84bec7545541ac2309",
"md5": "7855d4861cddbdbe94fe70b5129d7865",
"sha256": "9db9ebd670afd535962ed6d3593a9a125b0286749a64660e22693cd2ff3f2508"
},
"downloads": -1,
"filename": "dks-0.1.2.tar.gz",
"has_sig": false,
"md5_digest": "7855d4861cddbdbe94fe70b5129d7865",
"packagetype": "sdist",
"python_version": "source",
"requires_python": null,
"size": 1283817,
"upload_time": "2023-01-30T23:19:15",
"upload_time_iso_8601": "2023-01-30T23:19:15.420791Z",
"url": "https://files.pythonhosted.org/packages/e2/ad/3e3958f6551ce272f652e86191293f9fdbcda0eb2d84bec7545541ac2309/dks-0.1.2.tar.gz",
"yanked": false,
"yanked_reason": null
}
],
"upload_time": "2023-01-30 23:19:15",
"github": true,
"gitlab": false,
"bitbucket": false,
"github_user": "deepmind",
"github_project": "dks",
"travis_ci": false,
"coveralls": false,
"github_actions": true,
"requirements": [],
"lcname": "dks"
}