# **Adaptive Power Neurons**
**Adaptive Power Neurons** is a Python library for building machine learning models from adaptive power perceptrons. These perceptrons dynamically adjust the power (degree) of their polynomial features and an input index bias during training, enabling the model to learn complex patterns.
The library is designed for both regression and classification tasks and supports multi-layer neural networks with adaptive neurons.
---
## **Features**
- **Dynamic Adaptation**: Perceptrons adjust their polynomial degree (power) and input index bias during training.
- **Polynomial Feature Expansion**: Supports automatic polynomial feature generation up to a specified degree.
- **Index Bias Adjustment**: Incorporates adjustable input bias for feature shifts.
- **Multi-Layer Support**: Create flexible, multi-layer neural networks.
- **Customizable Optimizer**: Fine-tune hyperparameters such as the learning rate, maximum polynomial power, and indexing rate.
---
## **Mathematical Overview**
### 1. **Polynomial Feature Expansion**
Each perceptron transforms the input into a polynomial feature vector:
\[
\phi(x) = [x^1, x^2, \dots, x^p]
\]
where \( p \) is the maximum power specified for the perceptron.
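For intuition, here is a minimal NumPy sketch of this expansion; the helper `polynomial_features` is illustrative and not part of the library's API:

```python
import numpy as np

def polynomial_features(x, max_power):
    """Expand each input feature into [x^1, x^2, ..., x^max_power]."""
    x = np.asarray(x, dtype=float)
    return np.stack([x ** p for p in range(1, max_power + 1)], axis=-1)

# Example: expand a 3-feature sample up to power 2
phi = polynomial_features(np.array([1.0, 2.0, 3.0]), max_power=2)
# phi -> [[1., 1.], [2., 4.], [3., 9.]]
```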
### 2. **Weighted Output**
The perceptron computes a weighted sum of the polynomial features:
\[
z = w_1 \phi(x_1) + w_2 \phi(x_2) + \dots + w_n \phi(x_n) + b
\]
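Continuing the sketch above, \( z \) can be computed as a dot product between a weight vector and the flattened polynomial features (the variable names are illustrative, not the library's internals):

```python
# One weight per polynomial term, plus a scalar bias (illustrative initialization)
rng = np.random.default_rng(0)
w = rng.normal(size=phi.size)
b = 0.0

# Weighted sum of the expanded features
z = np.dot(w, phi.ravel()) + b
```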
### 3. **Loss Function**
The library uses **Mean Squared Error (MSE)** for regression:
\[
\text{MSE} = \frac{1}{N} \sum_{i=1}^{N} \left( y_i - \hat{y}_i \right)^2
\]
For classification, predictions are produced by a step function:
\[
\hat{y} =
\begin{cases}
1 & \text{if } z \geq 0 \\
0 & \text{if } z < 0
\end{cases}
\]
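Both pieces are simple to express in plain NumPy; the snippet below is a small illustration, not the library's implementation:

```python
def mse(y_true, y_pred):
    """Mean squared error for regression."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    return np.mean((y_true - y_pred) ** 2)

def step(z):
    """Step activation for classification: 1 if z >= 0, else 0."""
    return np.where(np.asarray(z) >= 0, 1, 0)

print(mse([0.5, 1.0], [0.4, 1.2]))   # 0.025
print(step([-0.3, 0.0, 2.1]))        # [0 1 1]
```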
### 4. **Index Bias Adjustment**
An adjustable index bias \( \delta \) shifts the input features:
\[
x_{\text{adjusted}} = x + \delta
\]
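In code this is just an element-wise shift applied before the polynomial expansion (continuing the illustrative sketch; the value of \( \delta \) here is arbitrary):

```python
delta = 0.05                                   # index bias (illustrative value)
x_adjusted = np.array([1.0, 2.0, 3.0]) + delta
phi_shifted = polynomial_features(x_adjusted, max_power=2)
```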
### 5. **Weight Updates**
Weights, biases, and index bias are updated using gradient descent:
\[
w_i = w_i - \eta \cdot \frac{\partial \text{MSE}}{\partial w_i}, \quad
b = b - \eta \cdot \frac{\partial \text{MSE}}{\partial b}, \quad
\delta = \delta - \eta \cdot \frac{\partial \text{MSE}}{\partial \delta}
\]
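The sketch below shows one such update for a single neuron on one regression sample, reusing the `polynomial_features` helper from above; it is a hand-rolled illustration of the update rule, not the library's training loop:

```python
def sgd_step(w, b, delta, x, y, max_power, lr=0.01):
    """One gradient-descent step on the squared error of a single sample."""
    x_adj = np.asarray(x, dtype=float) + delta              # index-bias shift
    phi = polynomial_features(x_adj, max_power).ravel()     # polynomial expansion
    y_hat = np.dot(w, phi) + b                              # neuron output
    err = y_hat - y                                         # prediction error

    grad_w = 2 * err * phi                                  # d(err^2)/dw
    grad_b = 2 * err                                        # d(err^2)/db
    # Each term x^p contributes p * x^(p-1) to d(phi)/d(delta); chain through the weights
    dphi = np.stack([p * x_adj ** (p - 1) for p in range(1, max_power + 1)], axis=-1).ravel()
    grad_delta = 2 * err * np.dot(w, dphi)                  # d(err^2)/d(delta)

    return w - lr * grad_w, b - lr * grad_b, delta - lr * grad_delta
```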
---
## **Installation**
Install the library from PyPI:
```bash
pip install adaptive-power-neurons
```
---

## **Example Usage**

```python
import numpy as np
from adaptive_power_neurons import AdaptivePowerModel, SGD, DenseLayer
# Hyperparameters
input_dim = 3 # Number of input features
output_dim = 2 # Number of output neurons
max_power = 2 # Max power for the neurons
learning_rate = 0.001 # Learning rate for the optimizer
indexing_rate = 0.01 # Indexing rate
# Create SGD optimizer
optimizer = SGD(learning_rate)
# Create AdaptivePowerModel and add layers
model = AdaptivePowerModel()
model.add(DenseLayer(input_dim, 1, max_power, optimizer, indexing_rate, activation="relu"))
model.add(DenseLayer(1, output_dim, max_power, optimizer, indexing_rate, activation="sigmoid"))
# Dummy dataset
x = np.array([[1.0, 2.0, 3.0], [4.0, 5.0, 6.0], [7.0, 8.0, 9.0]]) # Input features
y = np.array([[0.5, 1.0], [1.0, 0.0], [0.0, 1.0]]) # Target labels
# Train the model
model.train(x, y, epochs=100, batch_size=1)
# Run a prediction on a new sample
model.predict_(np.array([[1, 2, 3]]))
```