ak-pynn

Name: ak-pynn
Version: 0.1.8 (PyPI)
Summary: A simple and efficient pure-Python neural network library that makes it easy to build multilayer neural networks.
Upload time: 2023-05-28 19:44:49
Author: Ankit Kohli
License: MIT
Keywords: neural network, pure python, ankit_nn, machine learning, ML, deep learning, MLP, perceptron, ankit kohli, ak_pynn
Requirements: none recorded
            
# MultiLayer Neural Network (ak_pynn)
A simple and efficient pure-Python neural network library that can be used to build, visualize, and deploy deep learning ANN models.

- Optimized for performance
- Better Visualization
- Cross platform



## Authors

- [@ankit_kohli](https://www.github.com/ankit869)


## License

[MIT](https://choosealicense.com/licenses/mit/)


## Support

For support, email contact.ankitkohli@gmail.com


## Features

- [x] Efficient implementations of activation functions and their gradients
    - [x]  Sigmoid
    - [x]  ReLU
    - [x]  Leaky ReLU
    - [x]  Softmax  
    - [x]  Softplus  
    - [x]  Tanh 
    - [x]  Elu  
    - [x]  Linear 
- [x] Efficient implementations of loss functions and their gradients
    - [x]  Mean squared error 
    - [x]  Mean absolute error
    - [x]  Binary cross entropy  
    - [x]  Categorical cross entropy  
- [x] Several methods for weights initialization
    - [x]  `'random uniform'`, `'random normal'`
    - [x]  `'Glorot Uniform'`, `'Glorot Normal'`
    - [x]  `'He Uniform'`, `'He Normal'`

- [x] Neural network optimization using 
    - [x]  Gradient Descent (Batch/ SGD / Mini-Batch)
    - [x]  Momentum
    - [x]  Adagrad
    - [x]  RMSprop
    - [x]  Adam

- [x] Regularizations
    - [x]  L1 Norm
    - [x]  L2 Norm
    - [x]  L1_L2 Norm
    - [x]  Dropouts

- [x] Batch Normalization
- [x] Early Stopping
- [x] Validation Splits
- [x] Predict Scores
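
To illustrate the kind of activation/gradient pairs and weight initializers listed above, here is a hedged NumPy sketch (not the library's internal implementation; function names here are illustrative):

```python
import numpy as np

def sigmoid(x):
    # logistic function: 1 / (1 + e^(-x))
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_grad(x):
    # derivative expressed through the forward value: s * (1 - s)
    s = sigmoid(x)
    return s * (1.0 - s)

def relu(x):
    return np.maximum(0.0, x)

def relu_grad(x):
    # subgradient: 1 where x > 0, else 0
    return (x > 0).astype(np.float64)

def glorot_uniform(fan_in, fan_out, rng=None):
    # Glorot/Xavier uniform: U(-limit, limit) with limit = sqrt(6 / (fan_in + fan_out))
    if rng is None:
        rng = np.random.default_rng(0)
    limit = np.sqrt(6.0 / (fan_in + fan_out))
    return rng.uniform(-limit, limit, size=(fan_in, fan_out))
```

Expressing each gradient in terms of the forward pass (as `sigmoid_grad` does) is the usual trick that keeps backpropagation cheap in pure-Python implementations.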
## Installation
Install the release (stable) version from PyPI:
```
pip install ak-pynn
```

## Usage/Examples

Import
```python
from ak_pynn.mlp import MLP
```

Usage 
```python
model = MLP()
model.add_layer(4, input_layer=True)
model.add_layer(10, activation_function='relu', batch_norm=True)
model.add_layer(10, activation_function='relu', dropouts=True)
model.add_layer(10, activation_function='relu')
model.add_layer(3, activation_function='softmax', output_layer=True)
model.compile_model(optimizer='Adam', loss_function='mse', metrics=['mse', 'accuracy'])
```
Output
```

                                ( MODEL SUMMARY )                        
        
        ===================================================================
               Layer           Activation    Output Shape      Params    
        ===================================================================

               Input             linear       (None, 4)          0       
        -------------------------------------------------------------------

               Dense              relu        (None, 10)         50      
        -------------------------------------------------------------------

         BatchNormalization       None        (None, 10)         40      
        -------------------------------------------------------------------

               Dense              relu        (None, 10)        110      
        -------------------------------------------------------------------

              Dropout             None        (None, 10)         0       
        -------------------------------------------------------------------

               Dense              relu        (None, 10)        110      
        -------------------------------------------------------------------

               Output           softmax       (None, 3)          33      
        -------------------------------------------------------------------

        ===================================================================

        Total Params  - 343
        Trainable Params  - 323
        Non-Trainable Params  - 20
        ___________________________________________________________________
              
```
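
The parameter counts in the summary follow the usual conventions: a dense layer with `n_in` inputs and `n_out` units has `n_in * n_out` weights plus `n_out` biases, and batch normalization over `k` features has `2k` trainable parameters (gamma, beta) and `2k` non-trainable ones (running mean and variance). A short sketch (assuming these conventions, which match the totals above):

```python
def dense_params(n_in, n_out):
    # weight matrix plus one bias per output unit
    return n_in * n_out + n_out

def batchnorm_params(n_features):
    # (trainable, non-trainable): gamma/beta vs. running mean/variance
    return 2 * n_features, 2 * n_features

dense = [dense_params(4, 10), dense_params(10, 10),
         dense_params(10, 10), dense_params(10, 3)]  # 50, 110, 110, 33
bn_train, bn_frozen = batchnorm_params(10)           # 20, 20
trainable = sum(dense) + bn_train                    # 323
total = trainable + bn_frozen                        # 343
print(total, trainable, bn_frozen)                   # → 343 323 20
```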
Visualizing model
```python
model.visualize()
```

![App Screenshot](https://drive.google.com/uc?id=1VHFYmo8ufV2_J0DuBvhipFNH1ezLQZIs)

Training the model
```python
model.fit(X_train, Y_train, epochs=200, batch_size=32, verbose=False, early_stopping=False, patience=3, validation_split=0.2)
model.predict_scores(X_test, Y_test, metrics=['accuracy', 'precision', 'macro_recall'])
plt.plot(model.history['Val_Losses'])
plt.plot(model.history['Losses'])

```
## Tests


[@mnist_test](https://github.com/ankit869/ak_pynn/blob/main/mnist_test.ipynb)

[@iris_test](https://github.com/ankit869/ak_pynn/blob/main/iris_test.ipynb)

[@mlp_demo](https://github.com/ankit869/ak_pynn/blob/main/mlp_demo.ipynb)
## Citation
If you use this library and would like to cite it, you can use:
```
Ankit Kohli, "ak-pynn: Neural Network library", 2023. [Online]. Available: https://github.com/ankit869/ak-pynn. [Accessed: DD- Month- 20YY].
```
or:
```
@misc{ak_pynn,
  author = {Ankit Kohli},
  title  = {ak-pynn: Neural Network library},
  month  = may,
  year   = {2023},
  note   = {Online; accessed <today>},
  url    = {https://github.com/ankit869/ak-pynn},
}
```

            
