TfELM


Name: TfELM
Version: 1.1
Download URL: https://files.pythonhosted.org/packages/cf/9c/07e485cb5910e0675c6fa042ed9590db31ad8822cb2d5b59ad53534b164e/tfelm-1.1.tar.gz
Home page: None
Summary: This framework provides a comprehensive set of tools and utilities for implementing and experimenting with Extreme Learning Machines using Python and TensorFlow. ELMs are a type of machine learning algorithm known for their simplicity, efficiency, and scalability, making them ideal for various applications, including classification, regression, and feature learning tasks.
Upload time: 2024-08-20 09:28:42
Maintainer: None
Maintainer email: Karol Struniawski <struniawski.karol@gmail.com>
Docs URL: None
Author: None
Author email: Karol Struniawski <struniawski.karol@gmail.com>
Requires Python: >=3.8
License: CC BY-NC 4.0 (Creative Commons Attribution-NonCommercial 4.0 International); full legal code at https://creativecommons.org/licenses/by-nc/4.0/legalcode
Keywords: Extreme Learning Machine, TensorFlow, Python, Neural networks, Artificial intelligence, Machine learning
VCS: none recorded
Bugtrack URL: None
Requirements: No requirements were recorded.
Travis-CI: No Travis.
Coveralls test coverage: No coveralls.
# TfELM: Extreme Learning Machines Framework with Python and TensorFlow

This framework provides a comprehensive set of tools and utilities for implementing and experimenting with Extreme Learning Machines using Python and TensorFlow. ELMs are a type of machine learning algorithm known for their simplicity, efficiency, and scalability, making them ideal for various applications, including classification, regression, and feature learning tasks.

![ELM Framework Logo](TfELM-logo.png)

## Table of Contents

- [Introduction](#introduction)
- [Features](#features)
- [Documentation](#documentation)
- [Installation](#installation)
- [Usage](#usage)
- [Examples](#examples)
- [License](#license)

## Introduction

Extreme Learning Machines (ELMs) represent a class of feedforward neural networks initially proposed by Huang et al. in 2006. Traditional neural networks typically involve iterative optimization methods to learn both input weights and hidden layer biases, leading to computationally intensive training processes. In contrast, ELMs revolutionize this paradigm by adopting a simplified approach where input weights and biases are randomly initialized, and only the output weights are learned through a single linear regression. This architectural choice not only drastically reduces training time but also alleviates computational complexity, rendering ELMs highly efficient for tackling large-scale problems.

### Basic Extreme Learning Machine concept
![ELM Framework Logo](https://github.com/KStruniawski/TfELM/blob/main/elm.png)
Let's denote the input data matrix as $`X`$ 
of size $`N \times M`$, where $`N`$ represents the 
number of samples and $`M`$ denotes the number of features. 
The hidden layer of an ELM consists of $`K`$ 
hidden neurons with random input weights 
$`W`$ of size $`M \times K`$
and biases $`b`$ of size $`1 \times K`$. 
The output weights $`\beta`$ are learned through a 
linear regression approach. Given the activation function 
$`g(\cdot)`$, the output of the hidden layer can be computed as:

$`H = g(X\mathbf{W} + \mathbf{b})`$

Where $`H`$ is the hidden layer output matrix of size 
$`N \times K`$. The output weights $`\beta`$ 
can then be determined by solving the linear regression problem:

$`\beta = (H^T H)^{-1} H^T Y`$

Where $`Y`$ is the target output matrix of size $`N \times L`$,
with $`L`$ representing the number of output neurons.
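
For readers who prefer code to notation, here is a minimal NumPy sketch of the closed-form training step described above. It is purely illustrative and does not use the TfELM API; tanh stands in for the activation function $`g`$, and the data are synthetic:

```python
import numpy as np

rng = np.random.default_rng(0)

N, M, K, L = 200, 10, 50, 3           # samples, features, hidden neurons, outputs
X = rng.normal(size=(N, M))           # input data
Y = np.eye(L)[rng.integers(0, L, N)]  # one-hot targets

W = rng.normal(size=(M, K))           # random input weights (never trained)
b = rng.normal(size=(1, K))           # random biases (never trained)

H = np.tanh(X @ W + b)                # hidden layer output, shape (N, K)
beta = np.linalg.pinv(H) @ Y          # output weights via the pseudoinverse,
                                      # i.e. (H^T H)^{-1} H^T Y when H^T H is invertible

Y_hat = H @ beta                      # predictions on the training data
```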

### Framework Overview

This repository offers a versatile and user-friendly framework for implementing and experimenting with ELMs using Python and TensorFlow. Researchers and practitioners alike can leverage this framework to explore novel machine learning techniques or seek efficient solutions to real-world problems. With its powerful toolkit for building and evaluating ELM models, this repository aims to facilitate seamless experimentation and deployment of ELM-based solutions across various domains.

### Scarcity of ELM Implementations and the Need for a Comprehensive Framework

Despite the effectiveness and efficiency of Extreme Learning Machines (ELMs), there is a noticeable scarcity of comprehensive implementations in the machine learning community. Existing implementations are often fragmented, with individual researchers providing standalone implementations tailored to specific research papers. These implementations lack the versatility required for broader applications and may not always be reliable, with some failing to produce accurate results. Moreover, many existing implementations are solely available in MATLAB, restricting accessibility and hindering collaboration in the open-source community.

To address these limitations, this framework fills a crucial gap by providing a comprehensive and user-friendly toolkit for implementing and experimenting with ELMs using Python and TensorFlow. Leveraging TensorFlow for computation offers several advantages, including the ability to harness the power of GPUs for accelerated training. TensorFlow's efficient GPU utilization significantly speeds up computation, making it ideal for handling large-scale datasets and computationally intensive tasks. Additionally, the framework supports both GPU and CPU execution, ensuring compatibility with various computing environments.

Furthermore, the framework facilitates model persistence by enabling users to save and load ELM models using the standard HDF5 format provided by TensorFlow. This feature ensures seamless model deployment and sharing, allowing researchers and practitioners to collaborate and build upon each other's work effectively.

## Features
### Overview
- **Comprehensive Framework**: In response to the scarcity of comprehensive implementations for Extreme Learning Machines (ELMs), this framework provides a robust and all-encompassing solution. Unlike fragmented implementations scattered across various research papers, this framework offers a unified toolkit tailored for versatile applications.
- **Efficient Training with TensorFlow**: Harnessing the power of TensorFlow, the framework ensures efficient training of ELM models. TensorFlow's GPU acceleration significantly speeds up computation, making it suitable for handling large-scale datasets and time-sensitive applications. Moreover, the framework supports both GPU and CPU execution, ensuring compatibility with diverse computing environments.
- **Versatile Applications**: ELMs are renowned for their versatility, capable of tackling a wide range of machine learning tasks, including classification, regression, and feature learning. With this framework, users can effortlessly apply ELMs to various real-world problems, thanks to its user-friendly interface and extensive functionalities.
- **Model Persistence and Portability**: The framework enables seamless model persistence by allowing users to save and load ELM models using the standard HDF5 format provided by TensorFlow. This feature facilitates model deployment, sharing, and collaboration, empowering researchers and practitioners to build upon each other's work effectively.
- **Modular Design and Extensibility**: Designed with a modular architecture, the framework empowers users to customize and extend various components of the ELM model effortlessly. This flexibility ensures adaptability to diverse research requirements and facilitates the integration of novel techniques and algorithms.
- **Comprehensive Documentation and Examples**: Detailed documentation and usage examples accompany the framework, providing users with comprehensive guidance on understanding and implementing ELMs. These resources facilitate rapid onboarding and empower users to leverage the full potential of the framework in their research and applications.
### Implemented Variants of Extreme Learning Machines

1. **Basic Extreme Learning Machine (ELM)**:
    - Implementation with 108 different activation functions available for regression and classification tasks. Integration with scikit-learn functions like cross_val_score for straightforward accessibility.

2. **Constrained Extreme Learning Machine (CELM)**:
    - Specifically designed for imbalanced datasets, generates weights based on class differences in the input data rather than random initialization.

3. **Deep Extreme Learning Machine (DeepELM)**:
    - Incorporates multiple layers, with each layer consisting of a type of ELM layer, enabling deep learning capabilities.

4. **Deep Representation Extreme Learning Machine (DrELM)**:
    - Applies a random projection to the input data, followed by a shift and a kernel function to obtain beta, forming a block that can be stacked to build a multilayer architecture.

5. **Enhanced Deep Representation Extreme Learning Machine (EHDrELM)**:
    - Incorporates skip connections from residual networks to consider the output of preceding blocks in calculating shifted mapping.

6. **Graph Regularized Extreme Learning Machine Autoencoder (GELM-AE)**:
    - Learns feature representation through unsupervised learning using the regularized Graph Laplacian concept.

7. **Kernel Extreme Learning Machine (KELM)**:
    - Utilizes Mercer Kernels to generate hidden layer output without defining the number of neurons, with support for various kernel functions and kernel matrix approximations using the Nystrom method.

8. **Local Receptive Field Extreme Learning Machine (LRF-ELM)**:
    - Implements randomly generated convolutional hidden nodes for feature mapping, suitable for classification tasks.

9. **Metaheuristic Algorithms for Extreme Learning Machine Optimization (MA-ELM)**:
    - Optimizes random weights for specific tasks using Metaheuristic Algorithms from the comprehensive mealpy package, enabling efficient population-based metaheuristic optimization.

10. **Multi-layer Extreme Learning Machine (ML-ELM)**:
    - Consists of multiple layers of ELM units (AE-ELM) for feature extraction followed by a final ELM layer for classification, adaptable to various layer configurations.

11. **Online Sequential Extreme Learning Machine (OS-ELM)**:
    - Utilizes the Woodbury matrix identity concept for incremental learning chunk-by-chunk of data.

12. **Regularized Extreme Learning Machine (RELM)**:
    - Applies regularization parameter optimization, with support for optimizing l1, l2, or combined norms using optimizers like FISTA, ISTA, LGFGBS, and PGD.

13. **Receptive Fields Extreme Learning Machine (RF-ELM)**:
    - Generates random receptive fields on top of the randomly generated input weights of the ELM, enhancing the feature representation.

14. **Residual Compensation Extreme Learning Machine (RC-ELM)**:
    - Residual version of ELM designed for regression tasks, incorporating residual learning techniques.

15. **Semi-Supervised Extreme Learning Machine (SS-ELM)**:
    - Suitable for tasks with labeled and unlabeled data, featuring custom methods for dataset splitting and evaluation.

16. **Subnetwork Extreme Learning Machine (SubELM)**:
    - Implements the SubELM algorithm, dividing the input space into subspaces and learning separate subnetworks for each subspace using random feature mapping and activation functions.

17. **Unsupervised Extreme Learning Machine (US-ELM)**:
    - Designed for unsupervised tasks like data embedding or clustering, leveraging ELM's capabilities for unsupervised learning.

18. **Weighted Extreme Learning Machine (WELM)**:
    - Incorporates weighted samples during training using methods like 'wei-1', 'wei-2', 'ban-1', 'ban-decay'.

All of the above methods support saving and loading models in the HDF5 format. They are also compatible with cross_val_score and provide both predict and predict_proba, enabling ROC/AUC curve calculations. Each algorithm runs on TensorFlow, ensuring efficient computation and compatibility with various computing environments.
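
As a quick illustration of that scikit-learn compatibility, the following sketch computes a ROC AUC score from predict_proba on a synthetic binary dataset. It assumes the ELMModel/ELMLayer API shown in the Usage section below and scikit-learn-style probability output:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

from Layers.ELMLayer import ELMLayer
from Models.ELMModel import ELMModel

# Synthetic binary classification data (stand-in for a real dataset)
X, y = make_classification(n_samples=500, n_features=20, random_state=42)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=42)

model = ELMModel(ELMLayer(number_neurons=1000, activation='mish'))
model.fit(X_tr, y_tr)

# Assumes predict_proba returns an (n_samples, n_classes) array,
# as with scikit-learn classifiers; take the positive-class column.
proba = model.predict_proba(X_te)
print("ROC AUC:", roc_auc_score(y_te, proba[:, 1]))
```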

## Documentation
Full [documentation](https://kstruniawski.github.io/TfELM/) for the framework is available online.

## Installation

To install the ELM Framework, simply clone this repository to your local machine:

```bash
git clone https://github.com/KStruniawski/TfELM.git
```
Then, navigate to the cloned directory and install the required dependencies using pip:
```bash
cd TfELM
pip install -r requirements.txt
```
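
The package is also published on PyPI (see the metadata above), so it should be installable directly with pip:
```bash
pip install TfELM
```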

## Usage

In this package, the folder structure is organized to facilitate easy navigation and utilization of the different components:

- **Data**: exemplary datasets sourced from UCI, providing readily available data for experimentation.
- **Examples**: sample code demonstrating the application of the various Extreme Learning Machine (ELM) methods.
- **Layers**: ELM layers, which can be employed directly or, preferably, through the **Models** directory.
- **Models**: model classes structured following well-established patterns, offering the familiar fit/predict/predict_proba methods and integrating with scikit-learn functions such as cross_val_score for cross-validated evaluation.
- **Optimisers**: optimizers such as the MA optimizer and the optimizers for applying l1/l2 norms in Regularized ELM (RELM).
- **Resources**: additional scripts that support the functionality of the entire package.

This organized structure aims to streamline access to and use of the different components of the ELM package.

To utilize the ELM Framework, follow these steps in your Python code:
```python
import numpy as np
import pandas as pd
from sklearn.metrics import accuracy_score
from sklearn.model_selection import RepeatedKFold, cross_val_score
from sklearn.preprocessing import LabelEncoder
from sklearn import preprocessing

from Layers.ELMLayer import ELMLayer
from Models.ELMModel import ELMModel


# Hyperparameters:
num_neurons = 1000
n_splits = 10
n_repeats = 50

# Loading sample dataset from Data folder
path = "../Data/ionosphere.txt"
df = pd.read_csv(path, delimiter='\t').fillna(0)
X = df.values[:, 1:]
y = df.values[:, 0]

# Label encoding and features normalization
label_encoder = LabelEncoder()
y = label_encoder.fit_transform(y)  # Encode class labels to numerical values
X = preprocessing.normalize(X)  # Normalize feature vectors

# Initialize an Extreme Learning Machine (ELM) layer
elm = ELMLayer(number_neurons=num_neurons, activation='mish')

# Create an ELM model wrapping the ELM layer
model = ELMModel(elm)

# Define a cross-validation strategy
cv = RepeatedKFold(n_splits=n_splits, n_repeats=n_repeats)

# Perform cross-validation to evaluate the model performance
scores = cross_val_score(model, X, y, cv=cv, scoring='accuracy', error_score='raise')

# Print the mean accuracy score obtained from cross-validation
print(np.mean(scores))

# Fit the ELM model to the entire dataset
model.fit(X, y)

# Save the trained model to a file
model.save("Saved Models/ELM_Model.h5")

# Load the saved model from the file
model = model.load("Saved Models/ELM_Model.h5")

# Evaluate the accuracy of the model on the training data
acc = accuracy_score(y, model.predict(X))
print(acc)
```
For more detailed usage instructions and examples, refer to the documentation and examples provided in the repository.

## Examples
The examples directory contains various usage examples demonstrating how to use the ELM Framework for different machine learning tasks, including classification, regression, and feature learning.

To run the examples, simply navigate to the desired example directory and execute the Python script:
```bash
cd Examples
python ELM.py
```

## Sample datasets
This repository utilizes sample datasets sourced from the UCI Machine Learning Repository.

## License
Shield: [![CC BY-NC 4.0][cc-by-nc-sa-shield]][cc-by-nc-sa]

This work is licensed under a
[Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License][cc-by-nc-sa].

[![CC BY-NC-SA 4.0][cc-by-nc-sa-image]][cc-by-nc-sa]

[cc-by-nc-sa]: http://creativecommons.org/licenses/by-nc-sa/4.0/
[cc-by-nc-sa-image]: https://licensebuttons.net/l/by-nc-sa/4.0/88x31.png
[cc-by-nc-sa-shield]: https://img.shields.io/badge/License-CC%20BY--NC--SA%204.0-lightgrey.svg

            

Raw data

            {
    "_id": null,
    "home_page": null,
    "name": "TfELM",
    "maintainer": null,
    "docs_url": null,
    "requires_python": ">=3.8",
    "maintainer_email": "Karol Struniawski <struniawski.karol@gmail.com>",
    "keywords": "Extreme Learning Machine, TensorFlow, Python, Neural networks, Artificial intelligence, Machine learning",
    "author": null,
    "author_email": "Karol Struniawski <struniawski.karol@gmail.com>",
    "download_url": "https://files.pythonhosted.org/packages/cf/9c/07e485cb5910e0675c6fa042ed9590db31ad8822cb2d5b59ad53534b164e/tfelm-1.1.tar.gz",
    "platform": null,
    "description": "# TfELM : Extreme Learning Machines Framework with Python and TensorFlow\r\n\r\nThis framework provides a comprehensive set of tools and utilities for implementing and experimenting with Extreme Learning Machines using Python and TensorFlow. ELMs are a type of machine learning algorithm known for their simplicity, efficiency, and scalability, making them ideal for various applications, including classification, regression, and feature learning tasks.\r\n\r\n![ELM Framework Logo](TfELM-logo.png)\r\n\r\n## Table of Contents\r\n\r\n- [Introduction](#introduction)\r\n- [Features](#features)\r\n- [Documentation](#documentation)\r\n- [Installation](#installation)\r\n- [Usage](#usage)\r\n- [Examples](#examples)\r\n- [License](#license)\r\n\r\n## Introduction\r\n\r\nExtreme Learning Machines (ELMs) represent a class of feedforward neural networks initially proposed by Huang et al. in 2006. Traditional neural networks typically involve iterative optimization methods to learn both input weights and hidden layer biases, leading to computationally intensive training processes. In contrast, ELMs revolutionize this paradigm by adopting a simplified approach where input weights and biases are randomly initialized, and only the output weights are learned through a single linear regression. This architectural choice not only drastically reduces training time but also alleviates computational complexity, rendering ELMs highly efficient for tackling large-scale problems.\r\n\r\n### Basic Extreme Learning Machine concept\r\n![ELM Framework Logo](https://github.com/KStruniawski/TfELM/blob/main/elm.png)\r\nLet's denote the input data matrix as $`X`$ \r\nof size $`N \\times M`$, where $`N`$ represents the \r\nnumber of samples and $`M`$ denotes the number of features. \r\nThe hidden layer of an ELM consists of $`K`$ \r\nhidden neurons with random input weights \r\n$`W`$ of size $`M \\times K`$\r\nand biases $`b`$ of size $`1 \\times K`$. \r\nThe output weights $`beta`$ are learned through a \r\nlinear regression approach. Given the activation function \r\n$`g(\\cdot)`$, the output of the hidden layer can be computed as:\r\n\r\n$`H = g(X\\mathbf{W} + \\mathbf{b})`$\r\n\r\nWhere $`H`$ is the hidden layer output matrix of size \r\n$`N \\times K`$. The output weights $`beta`$ \r\ncan then be determined by solving the linear regression problem:\r\n\r\n$`beta = (H^TH)^{-1}H^TY `$\r\n\r\nWhere $`Y`$ is the target output matrix of size $`N \\times L`$,\r\nwith $`L`$ representing the number of output neurons.\r\n\r\n### Framework Overview\r\n\r\nThis repository offers a versatile and user-friendly framework for implementing and experimenting with ELMs using Python and TensorFlow. Researchers and practitioners alike can leverage this framework to explore novel machine learning techniques or seek efficient solutions to real-world problems. With its powerful toolkit for building and evaluating ELM models, this repository aims to facilitate seamless experimentation and deployment of ELM-based solutions across various domains.\r\n\r\n### Scarcity of ELM Implementations and the Need for a Comprehensive Framework\r\n\r\nDespite the effectiveness and efficiency of Extreme Learning Machines (ELMs), there is a noticeable scarcity of comprehensive implementations in the machine learning community. Existing implementations are often fragmented, with individual researchers providing standalone implementations tailored to specific research papers. 
These implementations lack the versatility required for broader applications and may not always be reliable, with some failing to produce accurate results. Moreover, many existing implementations are solely available in MATLAB, restricting accessibility and hindering collaboration in the open-source community.\r\n\r\nTo address these limitations, this framework fills a crucial gap by providing a comprehensive and user-friendly toolkit for implementing and experimenting with ELMs using Python and TensorFlow. Leveraging TensorFlow for computation offers several advantages, including the ability to harness the power of GPUs for accelerated training. TensorFlow's efficient GPU utilization significantly speeds up computation, making it ideal for handling large-scale datasets and computationally intensive tasks. Additionally, the framework supports both GPU and CPU execution, ensuring compatibility with various computing environments.\r\n\r\nFurthermore, the framework facilitates model persistence by enabling users to save and load ELM models using the standard HDF5 format provided by TensorFlow. This feature ensures seamless model deployment and sharing, allowing researchers and practitioners to collaborate and build upon each other's work effectively.\r\n\r\n## Features\r\n### Overview\r\n- **Comprehensive Framework**: In response to the scarcity of comprehensive implementations for Extreme Learning Machines (ELMs), this framework provides a robust and all-encompassing solution. Unlike fragmented implementations scattered across various research papers, this framework offers a unified toolkit tailored for versatile applications.\r\n- **Efficient Training with TensorFlow**: Harnessing the power of TensorFlow, the framework ensures efficient training of ELM models. TensorFlow's GPU acceleration significantly speeds up computation, making it suitable for handling large-scale datasets and time-sensitive applications. Moreover, the framework supports both GPU and CPU execution, ensuring compatibility with diverse computing environments.\r\n- **Versatile Applications**: ELMs are renowned for their versatility, capable of tackling a wide range of machine learning tasks, including classification, regression, and feature learning. With this framework, users can effortlessly apply ELMs to various real-world problems, thanks to its user-friendly interface and extensive functionalities.\r\n- **Model Persistence and Portability**: The framework enables seamless model persistence by allowing users to save and load ELM models using the standard HDF5 format provided by TensorFlow. This feature facilitates model deployment, sharing, and collaboration, empowering researchers and practitioners to build upon each other's work effectively.\r\n- **Modular Design and Extensibility**: Designed with a modular architecture, the framework empowers users to customize and extend various components of the ELM model effortlessly. This flexibility ensures adaptability to diverse research requirements and facilitates the integration of novel techniques and algorithms.\r\n- **Comprehensive Documentation and Examples**: Detailed documentation and usage examples accompany the framework, providing users with comprehensive guidance on understanding and implementing ELMs. These resources facilitate rapid onboarding and empower users to leverage the full potential of the framework in their research and applications.\r\n### Implemented Variants of Extreme Learning Machines\r\n\r\n1. 
**Basic Extreme Learning Machine (ELM)**:\r\n    - Implementation with 108 various activation functions available for regression and classification tasks. Integration with scikit-learn functions like cross_val_score for straightforward accessibility.\r\n\r\n2. **Constrained Extreme Learning Machine (CELM)**:\r\n    - Specifically designed for imbalanced datasets, generates weights based on class differences in the input data rather than random initialization.\r\n\r\n3. **Deep Extreme Learning Machine (DeepELM)**:\r\n    - Incorporates multiple layers, with each layer consisting of a type of ELM layer, enabling deep learning capabilities.\r\n\r\n4. **Deep Representation Extreme Learning Machine (DrELM)**:\r\n    - Utilizes random projection on input data, followed by shift and kernel function application to obtain beta, forming a block that can be stacked together for multilayer architecture.\r\n\r\n5. **Enhanced Deep Representation Extreme Learning Machine (EHDrELM)**:\r\n    - Incorporates skip connections from residual networks to consider the output of preceding blocks in calculating shifted mapping.\r\n\r\n6. **Graph Regularized Extreme Learning Machine Autoencoder (GELM-AE)**:\r\n    - Learns feature representation through unsupervised learning using the regularized Graph Laplacian concept.\r\n\r\n7. **Kernel Extreme Learning Machine (KELM)**:\r\n    - Utilizes Mercer Kernels to generate hidden layer output without defining the number of neurons, with support for various kernel functions and kernel matrix approximations using the Nystrom method.\r\n\r\n8. **Local Receptive Field Extreme Learning Machine (LRF-ELM)**:\r\n    - Implements randomly generated convolutional hidden nodes for feature mapping, suitable for classification tasks.\r\n\r\n9. **Metaheuristic Algorithms for Extreme Learning Machine Optimization (MA-ELM)**:\r\n    - Optimizes random weights for specific tasks using Metaheuristic Algorithms from the comprehensive mealpy package, enabling efficient population-based metaheuristic optimization.\r\n\r\n10. **Multi-layer Extreme Learning Machine (ML-ELM)**:\r\n    - Consists of multiple layers of ELM units (AE-ELM) for feature extraction followed by a final ELM layer for classification, adaptable to various layer configurations.\r\n\r\n11. **Online Sequential Extreme Learning Machine (OS-ELM)**:\r\n    - Utilizes the Woodbury matrix identity concept for incremental learning chunk-by-chunk of data.\r\n\r\n12. **Regularized Extreme Learning Machine (RELM)**:\r\n    - Applies regularization parameter optimization, with support for optimizing l1, l2, or combined norms using optimizers like FISTA, ISTA, LGFGBS, and PGD.\r\n\r\n13. **Receptive Fields Extreme Learning Machine (RF-ELM)**:\r\n    - Randomly generates random fields on top of randomly generated input weights in ELM, enhancing feature representation.\r\n\r\n14. **Residual Compensation Extreme Learning Machine (RC-ELM)**:\r\n    - Residual version of ELM designed for regression tasks, incorporating residual learning techniques.\r\n\r\n15. **Semi-Supervised Extreme Learning Machine (SS-ELM)**:\r\n    - Suitable for tasks with labeled and unlabeled data, featuring custom methods for dataset splitting and evaluation.\r\n\r\n16. **Subnetwork Extreme Learning Machine (SubELM)**:\r\n    - Implements the SubELM algorithm, dividing the input space into subspaces and learning separate subnetworks for each subspace using random feature mapping and activation functions.\r\n\r\n17. 
**Unsupervised Extreme Learning Machine (US-ELM)**:\r\n    - Designed for unsupervised tasks like data embedding or clustering, leveraging ELM's capabilities for unsupervised learning.\r\n\r\n18. **Weighted Extreme Learning Machine (WELM)**:\r\n    - Incorporates weighted samples during training using methods like 'wei-1', 'wei-2', 'ban-1', 'ban-decay'.\r\n\r\nFor all the mentioned methods, there is support for saving and loading mechanisms using HDF5 format. Additionally, they are compatible with cross_val_score, providing both predict and predict_proba functionalities for ROC/AUC curve calculations. Each algorithm runs on TensorFlow, ensuring efficient computation and compatibility with various computing environments.\r\n\r\n## Documentation\r\n[Documentation](https://kstruniawski.github.io/TfELM/) is available at the provided link.\r\n\r\n## Installation\r\n\r\nTo install the ELM Framework, simply clone this repository to your local machine:\r\n\r\n```bash\r\ngit clone https://github.com/KStruniawski/TfELM.git\r\n```\r\nThen, navigate to the cloned directory and install the required dependencies using pip:\r\n```bash\r\ncd elm-framework\r\npip install -r requirements.txt\r\n```\r\n\r\n## Usage\r\n\r\nIn this package, the folder structure is organized to facilitate easy navigation and utilization of different components. The **Data** folder contains exemplary datasets sourced from UCI, providing users with readily available data for experimentation. Within the **Examples** directory, users can find sample code demonstrating the application of various Extreme Learning Machine (ELM) methods. The **Layers** folder houses ELM layers, which can be directly employed or, preferably, utilized through the **Models** directory. The Models directory is structured following well-established patterns, facilitating usage through familiar fit/predict/predict_proba methods. Additionally, integration with scikit-learn functions like cross_val_score for cross-validated data is supported. The **Optimisers** folder contains optimizers such as the MA optimizer or optimizers for applying l1/l2 norms in Regularized ELM (RELM). Lastly, the **Resources** directory contains various additional scripts that support the functionality of the entire package. 
This organized structure aims to streamline the process of accessing and utilizing different components of the ELM package.\r\n\r\nTo utilize the ELM Framework, follow these steps in your Python code:\r\n```pythoncode\r\nimport numpy as np\r\nimport pandas as pd\r\nfrom sklearn.metrics import accuracy_score\r\nfrom sklearn.model_selection import RepeatedKFold, cross_val_score\r\nfrom sklearn.preprocessing import LabelEncoder\r\nfrom sklearn import preprocessing\r\n\r\nfrom Layers.ELMLayer import ELMLayer\r\nfrom Models.ELMModel import ELMModel\r\n\r\n\r\n# Hyperparameters:\r\nnum_neurons = 1000\r\nn_splits = 10\r\nn_repeats = 50\r\n\r\n# Loading sample dataset from Data folder\r\npath = \"../Data/ionosphere.txt\"\r\ndf = pd.read_csv(path, delimiter='\\t').fillna(0)\r\nX = df.values[:, 1:]\r\ny = df.values[:, 0]\r\n\r\n# Label encoding and features normalization\r\nlabel_encoder = LabelEncoder()\r\ny = label_encoder.fit_transform(y)  # Encode class labels to numerical values\r\nX = preprocessing.normalize(X)  # Normalize feature vectors\r\n\r\n# Initialize an Extreme Learning Machine (ELM) layer\r\nelm = ELMLayer(number_neurons=num_neurons, activation='mish')\r\n\r\n# Create an ELM model using the trained ELM layer\r\nmodel = ELMModel(elm)\r\n\r\n# Define a cross-validation strategy\r\ncv = RepeatedKFold(n_splits=n_splits, n_repeats=n_repeats)\r\n\r\n# Perform cross-validation to evaluate the model performance\r\nscores = cross_val_score(model, X, y, cv=cv, scoring='accuracy', error_score='raise')\r\n\r\n# Print the mean accuracy score obtained from cross-validation\r\nprint(np.mean(scores))\r\n\r\n# Fit the ELM model to the entire dataset\r\nmodel.fit(X, y)\r\n\r\n# Save the trained model to a file\r\nmodel.save(\"Saved Models/ELM_Model.h5\")\r\n\r\n# Load the saved model from the file\r\nmodel = model.load(\"Saved Models/ELM_Model.h5\")\r\n\r\n# Evaluate the accuracy of the model on the training data\r\nacc = accuracy_score(model.predict(X), y)\r\nprint(acc)\r\n```\r\nFor more detailed usage instructions and examples, refer to the documentation and examples provided in the repository.\r\n\r\n## Examples\r\nThe examples directory contains various usage examples demonstrating how to use the ELM Framework for different machine learning tasks, including classification, regression, and feature learning.\r\n\r\nTo run the examples, simply navigate to the desired example directory and execute the Python script:\r\n```bash\r\ncd Examples\r\npython ELM.py\r\n```\r\n\r\n## Sample datasets\r\nThis repository utilizes sample datasets sourced from the UCI Machine Learning Repository.\r\n\r\n## License\r\nShield: [![CC BY-NC 4.0][cc-by-nc-sa-shield]][cc-by-nc-sa]\r\n\r\nThis work is licensed under a\r\n[Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License][cc-by-nc-sa].\r\n\r\n[![CC BY-NC-SA 4.0][cc-by-nc-sa-image]][cc-by-nc-sa]\r\n\r\n[cc-by-nc-sa]: http://creativecommons.org/licenses/by-nc-sa/4.0/\r\n[cc-by-nc-sa-image]: https://licensebuttons.net/l/by-nc-sa/4.0/88x31.png\r\n[cc-by-nc-sa-shield]: https://img.shields.io/badge/License-CC%20BY--NC--SA%204.0-lightgrey.svg\r\n",
    "bugtrack_url": null,
    "license": "Creative Commons Legal Code  CC BY-NC 4.0  Attribution-NonCommercial 4.0 International  =======================================================================  Creative Commons Corporation (\"Creative Commons\") is not a law firm and does not provide legal services or legal advice. Distribution of Creative Commons public licenses does not create a lawyer-client or other relationship. Creative Commons makes its licenses and related information available on an \"as-is\" basis. Creative Commons gives no warranties regarding its licenses, any material licensed under their terms and conditions, or any related information. Creative Commons disclaims all liability for damages resulting from their use to the fullest extent possible.  Using Creative Commons Public Licenses  Creative Commons public licenses provide a standard set of terms and conditions that creators and other rights holders may use to share original works of authorship and other material subject to copyright and certain other rights specified in the public license below. The following considerations are for informational purposes only, are not exhaustive, and do not form part of our licenses.  Considerations for licensors: Our public licenses are intended for use by those authorized to give the public permission to use material in ways otherwise restricted by copyright and certain other rights. Our licenses are irrevocable. Licensors should read and understand the terms and conditions of the license they choose before applying it. Licensors should also secure all rights necessary before applying our licenses so that the public can reuse the material as expected. Licensors should clearly mark any material not subject to the license. This includes other CC-licensed material, or material used under an exception or limitation to copyright. More considerations for licensors: https://creativecommons.org/share-your-work/licensing-considerations/  Considerations for the public: By using one of our public licenses, a licensor grants the public permission to use the licensed material under specified terms and conditions. If the licensor's permission is not necessary for any reason\u2013 for example, because of any applicable exception or limitation to copyright\u2013then that use is not regulated by the license. Our licenses grant only permissions under copyright and certain other rights that a licensor has authority to grant. Use of the licensed material may still be restricted for other reasons, including because others have copyright or other rights in the material. A licensor may make special requests, such as asking that all changes be marked or described. Although not required by our licenses, you are encouraged to respect those requests where reasonable. More considerations for the public: https://creativecommons.org/share-your-work/licensing-considerations/  Creative Commons Attribution-NonCommercial 4.0 International Public License  By exercising the Licensed Rights (defined below), You accept and agree to be bound by the terms and conditions of this Creative Commons Attribution-NonCommercial 4.0 International Public License (\"Public License\"). To the extent this Public License may be interpreted as a contract, You are granted the Licensed Rights in consideration of Your acceptance of these terms and conditions, and the Licensor grants You such rights in consideration of benefits the Licensor receives from making the Licensed Material available under these terms and conditions.  Section 1 \u2013 Definitions.  
Adapted Material means material subject to Copyright and Similar Rights that is derived from or based upon the Licensed Material and in which the Licensed Material is translated, altered, arranged, transformed, or otherwise modified in a manner requiring permission under the Copyright and Similar Rights held by the Licensor. For purposes of this Public License, where the Licensed Material is a musical work, performance, or sound recording, Adapted Material is always produced where the Licensed Material is synched in timed relation with a moving image.  Adapter's License means the license You apply to Your Copyright and Similar Rights in Your contributions to Adapted Material in accordance with the terms and conditions of this Public License.  BY-NC Compatible License means a license listed at https://creativecommons.org/compatiblelicenses, approved by Creative Commons as essentially the equivalent of this Public License.  Copyright and Similar Rights means copyright and/or similar rights closely related to copyright including, without limitation, performance, broadcast, sound recording, and Sui Generis Database Rights, without regard to how the rights are labeled or categorized. For purposes of this Public License, the rights specified in Section 2(b)(1)-(2) are not Copyright and Similar Rights.  Effective Technological Measures means those measures that, in the absence of proper authority, may not be circumvented under laws fulfilling obligations under Article 11 of the WIPO Copyright Treaty adopted on December 20, 1996, and/or similar international agreements.  Exceptions and Limitations means fair use, fair dealing, and/or any other exception or limitation to Copyright and Similar Rights that applies to Your use of the Licensed Material.  Licensed Material means the artistic or literary work, database, or other material to which the Licensor applied this Public License.  Licensed Rights means the rights granted to You subject to the terms and conditions of this Public License, which are limited to all Copyright and Similar Rights that apply to Your use of the Licensed Material and that the Licensor has authority to license. For purposes of this Public License, simply using the Licensed Material does not automatically result in You acquiring any rights under the License. All rights not expressly granted by the Licensor are reserved. See Section 1(c).  NonCommercial means not primarily intended for or directed towards commercial advantage or monetary compensation. For purposes of this Public License, the exchange of the Licensed Material for other material subject to Copyright and Similar Rights by digital file-sharing or similar means is NonCommercial provided there is no payment of monetary compensation in connection with the exchange.  Share means to provide material to the public by any means or process that requires permission under the Licensed Rights, such as reproduction, public display, public performance, distribution, dissemination, communication, or importation, and to make material available to the public including in ways that members of the public may access the material from a place and at a time individually chosen by them.  Sui Generis Database Rights means rights other than copyright resulting from Directive 96/9/EC of the European Parliament and of the Council of 11 March 1996 on the legal protection of databases, as amended and/or succeeded, as well as other essentially equivalent rights anywhere in the world.  
You means the individual or entity exercising the Licensed Rights under this Public License. Your has a corresponding meaning.  Section 2 – Scope.  License grant.  Subject to the terms and conditions of this Public License, the Licensor hereby grants You a worldwide, royalty-free, non-sublicensable, non-exclusive, irrevocable license to exercise the Licensed Rights in the Licensed Material to:  reproduce and Share the Licensed Material, in whole or in part, for NonCommercial purposes only; and produce, reproduce, and Share Adapted Material for NonCommercial purposes only.  Exceptions and Limitations. For the avoidance of doubt, where Exceptions and Limitations apply to Your use, this Public License does not apply, and You do not need to comply with its terms and conditions.  Term. The term of this Public License is specified in Section 6(a).  Media and formats; technical modifications allowed. The Licensor authorizes You to exercise the Licensed Rights in all media and formats whether now known or hereafter created, and to make technical modifications necessary to do so. The Licensor waives and/or agrees not to assert any right or authority to forbid You from making technical modifications necessary to exercise the Licensed Rights, including technical modifications necessary to circumvent Effective Technological Measures. For purposes of this Public License, simply making modifications authorized by this Section 2(a)(4) never produces Adapted Material.  Downstream recipients.  Offer from the Licensor – Licensed Material. Every recipient of the Licensed Material automatically receives an offer from the Licensor to exercise the Licensed Rights under the terms and conditions of this Public License.  Additional offer from the Licensor – Adapted Material. Every recipient of Adapted Material from You automatically receives an offer from the Licensor to exercise the Licensed Rights in the Adapted Material under the conditions of the Adapter's License You apply.  No downstream restrictions. You may not offer or impose any additional or different terms or conditions on, or apply any Effective Technological Measures to, the Licensed Material if doing so restricts exercise of the Licensed Rights by any recipient of the Licensed Material.  No endorsement. Nothing in this Public License constitutes or may be construed as permission to assert or imply that You are, or that Your use of the Licensed Material is, connected with, or sponsored, endorsed, or granted official status by, the Licensor or others designated to receive attribution as provided in Section 3(a)(1)(A)(i).  Section 3 – License Conditions.  Your exercise of the Licensed Rights is expressly made subject to the following conditions.  Attribution.  
If You Share the Licensed Material (including in modified form), You must:  retain the following if it is supplied by the Licensor with the Licensed Material:  identification of the creator(s) of the Licensed Material and any others designated to receive attribution, in any reasonable manner requested by the Licensor (including by pseudonym if designated); a copyright notice; a notice that refers to this Public License; a notice that refers to the disclaimer of warranties; a URI or hyperlink to the Licensed Material to the extent reasonably practicable;  indicate if You modified the Licensed Material and retain an indication of any previous modifications; and indicate the Licensed Material is licensed under this Public License, and include the text of, or the URI or hyperlink to, this Public License.  NonCommercial Restrictions.  The licensor permits others to copy, distribute and transmit the work. In return, licensees may not use the work for commercial purposes – unless they get the licensor's permission.  Section 4 – Sui Generis Database Rights.  Where the Licensed Rights include Sui Generis Database Rights that apply to Your use of the Licensed Material:  for the avoidance of doubt, Section 2(a)(1) grants You the right to extract, reuse, reproduce, and Share all or a substantial portion of the contents of the database; if You include all or a substantial portion of the database contents in a database in which You have Sui Generis Database Rights, then the database in which You have Sui Generis Database Rights (but not its individual contents) is Adapted Material, including for purposes of Section 3(b); and You must comply with the conditions in Section 3(a) if You Share all or a substantial portion of the contents of the database.  For the avoidance of doubt, this Section 4 supplements and does not replace Your obligations under this Public License where the Licensed Rights include other Copyright and Similar Rights.  Section 5 – Disclaimer of Warranties and Limitation of Liability.  Unless otherwise separately undertaken by the Licensor, to the extent possible, the Licensor offers the Licensed Material as-is and as-available, and makes no representations or warranties of any kind concerning the Licensed Material, whether express, implied, statutory, or other. This includes, without limitation, warranties of title, merchantability, fitness for a particular purpose, non-infringement, absence of latent or other defects, accuracy, or the presence or absence of errors, whether or not known or discoverable. Where disclaimers of warranties are not allowed in full or in part, this disclaimer may not apply to You.  To the extent possible, in no event will the Licensor be liable to You on any legal theory (including, without limitation, negligence) or otherwise for any direct, special, indirect, incidental, consequential, punitive, exemplary, or other losses, costs, expenses, or damages arising out of this Public License or use of the Licensed Material, even if the Licensor has been advised of the possibility of such losses, costs, expenses, or damages. Where a limitation of liability is not allowed in full or in part, this limitation may not apply to You.  The disclaimer of warranties and limitation of liability provided above shall be interpreted in a manner that, to the extent possible, most closely approximates an absolute disclaimer and waiver of all liability.  Section 6 – Term and Termination.  
This Public License applies for the term of the Copyright and Similar Rights licensed here. However, if You fail to comply with this Public License, then Your rights under this Public License terminate automatically.  Where Your right to use the Licensed Material has terminated under Section 6(a), it reinstates:  automatically as of the date the violation is cured, provided it is cured within 30 days of Your discovery of the violation; or upon express reinstatement by the Licensor.  For the avoidance of doubt, this Section 6(b) does not affect any right the Licensor may have to seek remedies for Your violations of this Public License.  For the avoidance of doubt, the Licensor may also offer the Licensed Material under separate terms or conditions or stop distributing the Licensed Material at any time; however, doing so will not terminate this Public License.  Sections 1, 5, 6, 7, and 8 survive termination of this Public License.  Section 7 – Other Terms and Conditions.  The Licensor shall not be bound by any additional or different terms or conditions communicated by You unless expressly agreed.  Any arrangements, understandings, or agreements regarding the Licensed Material not stated herein are separate from and independent of the terms and conditions of this Public License.  Section 8 – Interpretation.  For the avoidance of doubt, this Public License does not, and shall not be interpreted to, reduce, limit, restrict, or impose conditions on any use of the Licensed Material that could lawfully be made without permission under this Public License.  To the extent possible, if any provision of this Public License is deemed unenforceable, it shall be automatically reformed to the minimum extent necessary to make it enforceable. If the provision cannot be reformed, it shall be severed from this Public License without affecting the enforceability of the remaining terms and conditions.  No term or condition of this Public License will be waived and no failure to comply consented to unless expressly agreed to by the Licensor.  Nothing in this Public License constitutes or may be interpreted as a limitation upon, or waiver of, any privileges and immunities that apply to the Licensor or You, including from the legal processes of any jurisdiction or authority. ",
    "summary": "This framework provides a comprehensive set of tools and utilities for implementing and experimenting with Extreme Learning Machines using Python and TensorFlow. ELMs are a type of machine learning algorithm known for their simplicity, efficiency, and scalability, making them ideal for various applications, including classification, regression, and feature learning tasks.",
    "version": "1.1",
    "project_urls": {
        "Documentation": "https://kstruniawski.github.io/TfELM",
        "Repository": "https://github.com/KStruniawski/TfELM"
    },
    "split_keywords": [
        "extreme learning machine",
        " tensorflow",
        " python",
        " neural networks",
        " artificial intelligence",
        " machine learning"
    ],
    "urls": [
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "377b576e574a67334e0c886945c4c185071ac15f60a57ae8eb1bc2efc09c8efa",
                "md5": "bede9af430c6d04bc57d3ccddce4b70a",
                "sha256": "b5817782ce28bd5753fb7fcaaa5ae0fd5842b5374e42c7a1dd4a8af6f72b1961"
            },
            "downloads": -1,
            "filename": "TfELM-1.1-py3-none-any.whl",
            "has_sig": false,
            "md5_digest": "bede9af430c6d04bc57d3ccddce4b70a",
            "packagetype": "bdist_wheel",
            "python_version": "py3",
            "requires_python": ">=3.8",
            "size": 118655,
            "upload_time": "2024-08-20T09:28:40",
            "upload_time_iso_8601": "2024-08-20T09:28:40.858229Z",
            "url": "https://files.pythonhosted.org/packages/37/7b/576e574a67334e0c886945c4c185071ac15f60a57ae8eb1bc2efc09c8efa/TfELM-1.1-py3-none-any.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "cf9c07e485cb5910e0675c6fa042ed9590db31ad8822cb2d5b59ad53534b164e",
                "md5": "ce6cc7a1264692a36fb838fb25b25161",
                "sha256": "ce625555fecb03ba5ab4808f97805d8ba37cdb5ff53c990536c389df0c613cb0"
            },
            "downloads": -1,
            "filename": "tfelm-1.1.tar.gz",
            "has_sig": false,
            "md5_digest": "ce6cc7a1264692a36fb838fb25b25161",
            "packagetype": "sdist",
            "python_version": "source",
            "requires_python": ">=3.8",
            "size": 79607,
            "upload_time": "2024-08-20T09:28:42",
            "upload_time_iso_8601": "2024-08-20T09:28:42.725224Z",
            "url": "https://files.pythonhosted.org/packages/cf/9c/07e485cb5910e0675c6fa042ed9590db31ad8822cb2d5b59ad53534b164e/tfelm-1.1.tar.gz",
            "yanked": false,
            "yanked_reason": null
        }
    ],
    "upload_time": "2024-08-20 09:28:42",
    "github": true,
    "gitlab": false,
    "bitbucket": false,
    "codeberg": false,
    "github_user": "KStruniawski",
    "github_project": "TfELM",
    "travis_ci": false,
    "coveralls": false,
    "github_actions": false,
    "requirements": [],
    "lcname": "tfelm"
}
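
The sha256 and md5 digests listed for each release file above can be used to check that a locally downloaded artifact matches the published upload. The following is a minimal sketch using only Python's standard hashlib module; the local filename TfELM-1.1-py3-none-any.whl is an assumption (the wheel must first be fetched, e.g. with "pip download TfELM==1.1" or from the URL given in the metadata), and the expected value is the sha256 digest published for the wheel above.

# Minimal sketch: verify a downloaded release file against the sha256 digest
# published in the metadata above. Assumes TfELM-1.1-py3-none-any.whl has
# already been downloaded into the current directory.
import hashlib

# sha256 for TfELM-1.1-py3-none-any.whl, copied from the "digests" entry above.
EXPECTED_SHA256 = "b5817782ce28bd5753fb7fcaaa5ae0fd5842b5374e42c7a1dd4a8af6f72b1961"

def sha256_of(path: str, chunk_size: int = 8192) -> str:
    """Return the hexadecimal sha256 digest of a file, read in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

if __name__ == "__main__":
    actual = sha256_of("TfELM-1.1-py3-none-any.whl")
    if actual == EXPECTED_SHA256:
        print("OK: digest matches the published release.")
    else:
        print(f"MISMATCH: got {actual}")

The same approach applies to the source distribution tfelm-1.1.tar.gz, using the sha256 value listed in its "digests" entry.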
        