miniMLP

Name: miniMLP
Version: 0.0.1
Home page: https://github.com/papaaannn/miniMLP
Summary: Implementation of very small scale Neural Network from scratch.
Upload time: 2023-03-15 19:52:17
Author: Soumyadip Sarkar
Requires Python: >=3.6
Docs URL: None
Requirements: No requirements were recorded.
Travis-CI: No Travis.
Coveralls test coverage: No coveralls.
# miniMLP

This repository contains an implementation of a small neural network in Python, with the ability to specify the number of hidden layers, the number of neurons in each layer, the activation function, and the optimizer to be used.

## Installation

```
pip install miniMLP
```

## Dependencies

The code requires the following dependencies:
- NumPy
- matplotlib (for visualization)

## Example Usage

Create an instance of the **`MLP`** class, specifying the input size, output size, number of hidden layers, sizes of each hidden layer, and activation function.
```python
import numpy as np
from miniMLP.engine import MLP

mlp = MLP(input_size=2, output_size=1, hidden_layers=3, hidden_sizes=[4, 6, 4], activation='relu')
```
Then, train the MLP using the **`train`** method, providing the training data (an array of inputs and the corresponding labels) along with optional parameters such as the learning rate, number of epochs, and optimizer.
```python
X_train = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y_train = np.array([[0], [1], [1], [0]])

mlp.train(X_train, y_train, epochs=2000, learning_rate=0.0001, optimizer='Adam')
```
Finally, use the **`predict`** method to obtain predictions for new data points.
```python
# X_new is illustrative: any array with the same number of features (here 2) as the training inputs.
X_new = np.array([[0, 1], [1, 1]])
y_pred = mlp.predict(X_new)
```
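
For the XOR data used above, a quick sanity check might look like the following. This assumes `predict` returns one output value per sample, which is not documented here:
```python
# Evaluate on the training inputs; after successful training the rounded
# outputs should be close to the XOR labels [[0], [1], [1], [0]].
y_pred = mlp.predict(X_train)
print(np.round(y_pred))
```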

## Activation Functions

The following activation functions are currently supported (a plain-NumPy sketch of a few of them follows the list):
- Sigmoid
- ReLU
- Tanh
- Softmax
- Leaky ReLU
- ELU
- GELU
- Softplus
- SELU
- PReLU
- Swish
- Gaussian
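
For reference, the sketch below shows plain-NumPy versions of a few of these functions. It only illustrates what they compute and is not necessarily how miniMLP implements them internally:
```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def relu(x):
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    # alpha is the slope used for negative inputs
    return np.where(x > 0, x, alpha * x)

def softmax(x):
    # subtract the row-wise max for numerical stability
    e = np.exp(x - np.max(x, axis=-1, keepdims=True))
    return e / np.sum(e, axis=-1, keepdims=True)

def swish(x):
    return x * sigmoid(x)
```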

## Optimizers

The following optimizers are currently supported (a generic sketch of their update rules follows the list):
- Adam
- Stochastic Gradient Descent
- Momentum
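
The standard update rules behind these optimizers are sketched below for a single weight array `w` and its gradient `grad`. This is a generic illustration, not the package's internal update code:
```python
import numpy as np

def sgd_step(w, grad, lr=0.01):
    # plain stochastic gradient descent
    return w - lr * grad

def momentum_step(w, grad, velocity, lr=0.01, beta=0.9):
    # momentum: accumulate an exponentially decaying velocity
    velocity = beta * velocity - lr * grad
    return w + velocity, velocity

def adam_step(w, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    # Adam: first/second moment estimates with bias correction (t starts at 1)
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad ** 2
    m_hat = m / (1 - beta1 ** t)
    v_hat = v / (1 - beta2 ** t)
    return w - lr * m_hat / (np.sqrt(v_hat) + eps), m, v
```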

## License

This project is licensed under the MIT License. Feel free to use and modify this code for your own projects.

            

Raw data

            {
    "_id": null,
    "home_page": "https://github.com/papaaannn/miniMLP",
    "name": "miniMLP",
    "maintainer": "",
    "docs_url": null,
    "requires_python": ">=3.6",
    "maintainer_email": "",
    "keywords": "",
    "author": "Soumyadip Sarkar",
    "author_email": "soumyadipsarkar@outlook.com",
    "download_url": "https://files.pythonhosted.org/packages/31/4d/1af8c4ed8e79f031e46b4805775da074efc667807c95c2ecc8393b53840b/miniMLP-0.0.1.tar.gz",
    "platform": null,
    "description": "# miniMLP\r\n\r\nThis repository contains an implementation of a small Neural Network in Python, with the ability to specify the number of hidden layers, the number of neurons in each layer, and the activation function, and the optimizer to be used.\r\n\r\n## Installation\r\n\r\n```\r\npip install miniMLP\r\n```\r\n\r\n## Dependencies\r\n\r\nThe code requires the following dependencies:\r\n- NumPy\r\n- matplotlib (for visualization)\r\n\r\n## Example Usage\r\n\r\nCreate an instance of the **`MLP`** class, specifying the input size, output size, number of hidden layers, sizes of each hidden layer, and activation function.\r\n```python\r\nimport numpy as np\r\nfrom miniMLP.engine import MLP\r\n\r\nmlp = MLP(input_size=2, output_size=1, hidden_layers=3, hidden_sizes=[4, 6, 4], activation='relu')\r\n```\r\nThen, train the MLP using the **`train`** method, providing the training data (an array of inputs and corresponding labels), and optional parameters such as the learning rate, number of epochs, and optimizer.\r\n```python\r\nX_train = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])\r\ny_train = np.array([[0], [1], [1], [0]])\r\n\r\nmlp.train(X_train, y_train, epochs=2000, learning_rate=0.0001, optimizer='Adam')\r\n```\r\nFinally, use the **`predict`** method to obtain predictions for new data points.\r\n```python\r\ny_pred = mlp.predict(X_new)\r\n```\r\n\r\n## Activation Functions\r\n\r\nThe following activation functions are currently supported:\r\n- Sigmoid\r\n- ReLU\r\n- Tanh\r\n- Softmax\r\n- Leaky ReLU\r\n- ELU\r\n- GELU\r\n- Softplus\r\n- SeLU\r\n- PReLu\r\n- Swish\r\n- Gaussian\r\n\r\n## Optimizers\r\n\r\nThe following optimizers are currently supported:\r\n- Adam\r\n- Stochastic Gradient Descent\r\n- Momentum\r\n\r\n## License\r\n\r\nThis project is licensed under the MIT License. Feel free to use and modify this code for your own projects.\r\n",
    "bugtrack_url": null,
    "license": "",
    "summary": "Implementation of very small scale Neural Network from scratch.",
    "version": "0.0.1",
    "split_keywords": [],
    "urls": [
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "314d1af8c4ed8e79f031e46b4805775da074efc667807c95c2ecc8393b53840b",
                "md5": "df10f6d8eb406daadfb25838d65b5985",
                "sha256": "7a3a1499213d3d57648f8ef39a3ae2038f809165fa207e89dbdbd46bde894cbd"
            },
            "downloads": -1,
            "filename": "miniMLP-0.0.1.tar.gz",
            "has_sig": false,
            "md5_digest": "df10f6d8eb406daadfb25838d65b5985",
            "packagetype": "sdist",
            "python_version": "source",
            "requires_python": ">=3.6",
            "size": 4346,
            "upload_time": "2023-03-15T19:52:17",
            "upload_time_iso_8601": "2023-03-15T19:52:17.495852Z",
            "url": "https://files.pythonhosted.org/packages/31/4d/1af8c4ed8e79f031e46b4805775da074efc667807c95c2ecc8393b53840b/miniMLP-0.0.1.tar.gz",
            "yanked": false,
            "yanked_reason": null
        }
    ],
    "upload_time": "2023-03-15 19:52:17",
    "github": true,
    "gitlab": false,
    "bitbucket": false,
    "github_user": "papaaannn",
    "github_project": "miniMLP",
    "travis_ci": false,
    "coveralls": false,
    "github_actions": false,
    "lcname": "minimlp"
}
        