            <div align="center">
  <img src="./figures/logo_long.svg" width="480" alt="rfflearn logo" />
</div>

<div align="center">
  <img src="https://img.shields.io/badge/PYTHON-%3E%3D3.9-blue.svg?logo=python&logoColor=white">
  &nbsp;
  <img src="https://img.shields.io/badge/LICENSE-MIT-orange.svg">
  &nbsp;
  <img src="https://img.shields.io/badge/COVERAGE-97%25-green.svg">
  &nbsp;
  <img src="https://img.shields.io/badge/QUALITY-9.96/10.0-yellow.svg">
</div>


Random Fourier Features
====================================================================================================

This repository provides `rfflearn`, a Python library of random Fourier features (hereinafter
abbreviated as RFF) [1, 2] for kernel methods such as support vector machines [3, 4] and Gaussian
process models [5]. The main features of this module are:

- **User-friendly interfaces**: The interfaces of the `rfflearn` module closely follow
  the [scikit-learn](https://scikit-learn.org/) library.
- **Example code first**: This repository provides plenty of [example code](./examples/)
  demonstrating that RFF is useful for real machine learning tasks.
- **GPU support**: Some classes in the `rfflearn` module provide GPU training and inference
  for faster computation.
- **Wrappers for other libraries**: Interfaces to [Optuna](https://optuna.org/) and
  [SHAP](https://github.com/slundberg/shap) are provided for easier hyperparameter tuning and
  feature importance analysis.

Currently, this module supports the following methods:

| Method                          | CPU support                  | GPU support           |
| ------------------------------- | ---------------------------- | --------------------- |
| canonical correlation analysis  | `rfflearn.cpu.RFFCCA`        | -                     |
| Gaussian process regression     | `rfflearn.cpu.RFFGPR`        | `rfflearn.gpu.RFFGPR` |
| Gaussian process classification | `rfflearn.cpu.RFFGPC`        | `rfflearn.gpu.RFFGPC` |
| principal component analysis    | `rfflearn.cpu.RFFPCA`        | `rfflearn.gpu.RFFPCA` |
| regression                      | `rfflearn.cpu.RFFRegression` | -                     |
| support vector classification   | `rfflearn.cpu.RFFSVC`        | `rfflearn.gpu.RFFSVC` |
| support vector regression       | `rfflearn.cpu.RFFSVR`        | -                     |

RFF is applicable to many machine learning algorithms besides those listed above.
The author plans to provide implementations of other algorithms soon.

**NOTE**: The author confirmed that the `rfflearn` module works on PyTorch 2.0! :tada:


Installation
----------------------------------------------------------------------------------------------------

Please install from PyPI.

```shell
pip install rfflearn
```

If you need GPU, Optuna, or SHAP support, you also need to install the following packages.

```shell
# For GPU support.
pip install torch

# For Optuna support.
pip install optuna

# For SHAP support.
pip install matplotlib shap
```

The author recommends using [venv](https://docs.python.org/3/library/venv.html) or
[Docker](https://www.docker.com/) to avoid polluting your environment.
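
If you installed the optional GPU dependency, you can quickly check whether a GPU is visible to
PyTorch before using the `rfflearn.gpu` classes. The check below relies only on PyTorch's public
API and does not involve `rfflearn` itself.

```python
# Quick sanity check for the optional GPU backend.
import torch

print(torch.__version__)            # Installed PyTorch version.
print(torch.cuda.is_available())    # True if a CUDA-capable GPU is usable.
```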


Minimal example
----------------------------------------------------------------------------------------------------

The interfaces provided by the `rfflearn` module closely follow scikit-learn.
For example, the following Python code shows a sample usage of the `RFFSVC`
(support vector classifier with random Fourier features) class.

```python
>>> import numpy as np
>>> import rfflearn.cpu as rfflearn                     # Import module
>>> X = np.array([[-1, -1], [-2, -1], [1, 1], [2, 1]])  # Define input data
>>> y = np.array([1, 1, 2, 2])                          # Define label data
>>> svc = rfflearn.RFFSVC().fit(X, y)                   # Training (on CPU)
>>> svc.score(X, y)                                     # Inference (on CPU)
1.0
>>> svc.predict(np.array([[-0.8, -1]]))
array([1])
```
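
Because the estimators follow the scikit-learn `fit`/`score`/`predict` convention shown above,
they can be combined directly with scikit-learn utilities. The following is a minimal sketch that
relies only on that interface; the synthetic dataset and split are just for illustration.

```python
# Minimal sketch: combine RFFSVC with scikit-learn data utilities.
# The dataset below is synthetic and only serves as an illustration.
import rfflearn.cpu as rfflearn
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=200, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

svc = rfflearn.RFFSVC().fit(X_train, y_train)   # Train on the training split.
print(svc.score(X_test, y_test))                # Accuracy on the held-out split.
```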

This module also supports training and inference on GPU.
For example, the following Python code shows a sample usage of the `RFFGPC`
(Gaussian process classifier with random Fourier features) class on GPU.
This code requires PyTorch (>= 1.7.0).

```python
>>> import numpy as np
>>> import rfflearn.gpu as rfflearn                     # Import module
>>> X = np.array([[-1, -1], [-2, -1], [1, 1], [2, 1]])  # Define input data
>>> y = np.array([1, 1, 2, 2])                          # Define label data
>>> gpc = rfflearn.RFFGPC().fit(X, y)                   # Training on GPU
>>> gpc.score(X, y)                                     # Inference on GPU
1.0
>>> gpc.predict(np.array([[-0.8, -1]]))
array([1])
```

See the [examples](./examples/) directory for more detailed examples.


Example 1: MNIST using random Fourier features
----------------------------------------------------------------------------------------------------

The author applied SVC (support vector classification) and GPC (Gaussian process classification)
with RFF to the MNIST dataset, one of the most famous benchmark datasets for image classification,
and obtained better accuracy and much faster inference than kernel SVM.
The following table gives a brief comparison of kernel SVM, SVC with RFF, and GPC with RFF.
See the examples of the [RFF SVC module](./examples/svc_for_mnist/) and
the [RFF GPC module](./examples/gpc_for_mnist/) for more details.

| Method         | RFF dimension | Inference time [us/image] | Score [%] |
|:--------------:|:-------------:|:-------------------------:|:---------:|
| Kernel SVM     |      -        | 1312.6 us                 | 96.30 %   |
| RFF SVC        |    640        |   33.6 us                 | 96.39 %   |
| RFF SVC (GPU)  |    640        |   1.11 us                 | 96.39 %   |
| RFF SVC        |  4,096        |  183.4 us                 | 98.14 %   |
| RFF SVC (GPU)  |  4,096        |   2.62 us                 | 98.14 %   |
| RFF GPC        | 20,000        |  517.9 us                 | 98.38 %   |
| RFF GPC (GPU)  | 20,000        |  61.52 us                 | 98.38 %   |

<div align="center">
  <img src="./figures/Inference_time_and_acc_on_MNIST.svg" width="640" alt="Accuracy for each epochs in RFF SVC/GPC" />
</div>
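
For reference, the following is a rough sketch of how such an experiment can be set up. Fetching
MNIST via scikit-learn and scaling pixel values to [0, 1] are assumptions made for illustration;
the linked examples are the authoritative versions and may use different preprocessing and
hyperparameters.

```python
# Rough sketch of an RFF SVC experiment on MNIST (assumed data loading and
# preprocessing; see the linked examples for the authoritative setup).
import numpy as np
import rfflearn.cpu as rfflearn
from sklearn.datasets import fetch_openml
from sklearn.model_selection import train_test_split

X, y = fetch_openml("mnist_784", version=1, return_X_y=True, as_frame=False)
X = X.astype(np.float64) / 255.0                  # Scale pixels to [0, 1].
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=10000, random_state=0)

svc = rfflearn.RFFSVC().fit(X_train, y_train)     # Train SVC with RFF.
print(svc.score(X_test, y_test))                  # Test accuracy.
```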


Example 2: Visualization of feature importance
----------------------------------------------------------------------------------------------------

This module also has interfaces to some feature importance methods, such as SHAP [6] and
permutation importance [7]. The author applied SHAP and permutation importance to an `RFFGPR`
model trained on the California housing dataset, and the following are the visualization results
obtained by `rfflearn.shap_feature_importance` and `rfflearn.permutation_feature_importance`.

<div align="center">
  <img src="./examples/feature_importances_for_california_housing/figures/figure_california_housing_shap_importance.png" width="450" alt="Permutation importances of Boston housing dataset" />
  <img src="./examples/feature_importances_for_california_housing/figures/figure_california_housing_permutation_importance.png" width="450" alt="SHAP importances of Boston housing dataset" />
</div>
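
The following is a minimal sketch of this workflow. The argument lists of
`rfflearn.shap_feature_importance` and `rfflearn.permutation_feature_importance` shown below are
assumptions; please refer to the [examples](./examples/) directory for the authoritative usage.

```python
# Minimal sketch of the feature-importance workflow. The argument lists of
# the two importance helpers are assumptions; see the examples directory
# for the authoritative usage.
import rfflearn.cpu as rfflearn
from sklearn.datasets import fetch_california_housing

X, y = fetch_california_housing(return_X_y=True)
gpr = rfflearn.RFFGPR().fit(X, y)                              # GP regression with RFF.
shap_imp = rfflearn.shap_feature_importance(gpr, X)            # SHAP importance (assumed signature).
perm_imp = rfflearn.permutation_feature_importance(gpr, X, y)  # Permutation importance (assumed signature).
```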


Quick Tutorial
----------------------------------------------------------------------------------------------------

First, clone the `random-fourier-features` repository from GitHub:

```console
git clone https://github.com/tiskw/random-fourier-features.git
cd random-fourier-features
```

If you are using the Docker image, enter the Docker container with the following command
(skip this step if you do not use Docker):

```console
docker run --rm -it --gpus all -v `pwd`:/work -w /work -u `id -u`:`id -g` tiskw/pytorch:latest bash
```

Then, launch python3 and try the following minimal code, which runs support vector classification
with random Fourier features on a tiny artificial dataset.

```python
>>> import numpy as np                                  # Import Numpy
>>> import rfflearn.cpu as rfflearn                     # Import our module
>>> X = np.array([[-1, -1], [-2, -1], [1, 1], [2, 1]])  # Define input data
>>> y = np.array([1, 1, 2, 2])                          # Define label data
>>> svc = rfflearn.RFFSVC().fit(X, y)                   # Training
>>> svc.score(X, y)                                     # Inference (on CPU)
1.0
>>> svc.predict(np.array([[-0.8, -1]]))
array([1])
```

### Next Step

Now you have successfully set up the `rfflearn` module.
As a next step, the author recommends browsing the [examples directory](/examples)
and trying the code you are interested in.


Notes
----------------------------------------------------------------------------------------------------

- The name of this module was changed from `pyrff` to `rfflearn` in October 2020, because the
  package name `pyrff` already existed on PyPI.
- If the training data is huge, an error message like `RuntimeError: The task could not be
  sent to the workers as it is too large for 'send_bytes'` may be raised from the joblib library.
  The reason for this error is that `sklearn.svm.LinearSVC` uses `joblib` as a multiprocessing
  backend, but joblib cannot handle arrays too large to fit in a 32-bit address space.
  In this case, please try the `n_jobs=1` option for the `RFFSVC` or `ORFSVC` classes, as in the
  snippet after this list. The default setting is `n_jobs=-1`, which automatically detects and
  uses all available CPUs. (This issue was reported by Mr. Katsuya Terahata @ Toyota Research
  Institute Advanced Development. Thank you so much for the report!)
- Application of RFF to the Gaussian process model is not straightforward.
  See [this document](./articles/rff_for_gaussian_process.pdf) for mathematical details.
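
The following snippet illustrates the `n_jobs=1` workaround mentioned in the notes above. Whether
`n_jobs` is accepted by the constructor may depend on the installed version, so treat this as a
sketch of the idea rather than authoritative usage.

```python
# Sketch of the n_jobs=1 workaround for the joblib "send_bytes" error.
# Assumption: n_jobs is accepted by the RFFSVC constructor; check the class
# documentation if this does not match your installed version.
import rfflearn.cpu as rfflearn

svc = rfflearn.RFFSVC(n_jobs=1)   # Single worker; avoids joblib multiprocessing.
```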


License
----------------------------------------------------------------------------------------------------

This software is distributed under the [MIT License](https://opensource.org/licenses/mit-license.php).
See [LICENSE](./LICENSE) for more information.


Reference
----------------------------------------------------------------------------------------------------

[1] A. Rahimi and B. Recht, "Random Features for Large-Scale Kernel Machines", NIPS, 2007.
[PDF](https://papers.nips.cc/paper/3182-random-features-for-large-scale-kernel-machines.pdf)

[2] F. X. Yu, A. T. Suresh, K. Choromanski, D. Holtmann-Rice and S. Kumar, "Orthogonal Random Features", NIPS, 2016.
[PDF](https://papers.nips.cc/paper/6246-orthogonal-random-features.pdf)

[3] V. Vapnik and A. Lerner, "Pattern recognition using generalized portrait method", Automation and Remote Control, vol. 24, 1963.

[4] B. Boser, I. Guyon and V. Vapnik, "A training algorithm for optimal margin classifiers", COLT, pp. 144-152, 1992.
[URL](https://dl.acm.org/doi/10.1145/130385.130401)

[5] C. Rasmussen and C. Williams, "Gaussian Processes for Machine Learning", MIT Press, 2006.

[6] S. M. Lundberg and S. Lee, "A Unified Approach to Interpreting Model Predictions", NIPS, 2017.
[PDF](https://proceedings.neurips.cc/paper/2017/file/8a20a8621978632d76c43dfd28b67767-Paper.pdf)

[7] L. Breiman, "Random Forests", Machine Learning, vol. 45, pp. 5-32, Springer, 2001.
[Springer website](https://doi.org/10.1023/A:1010933404324).


Author
----------------------------------------------------------------------------------------------------

Tetsuya Ishikawa ([EMail](mailto:tiskw111@gmail.com), [Website](https://tiskw.github.io/about_en.html))

            
