Name | nonlinear-curve-fitter |
Version | 1.2.0 |
Summary | A gradient descent optimizer that helps fit multivariate nonlinear curves to data |
upload_time | 2024-09-06 23:35:51 |
home_page | None |
maintainer | None |
docs_url | None |
author | None |
requires_python | >=3.7 |
license | MIT License Copyright (c) 2024 Behzad-amn Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. |
keywords | gradient descent, curve fitting, nonlinear, multivariate, machine learning, optimization |
VCS | |
bugtrack_url | |
requirements | No requirements were recorded. |
Travis-CI | No Travis. |
coveralls test coverage | No coveralls. |
# Multivariate Nonlinear Gradient Descent Curve Fitting
This package provides a multivariate nonlinear curve fitter that uses gradient descent to minimize mean squared error. It is designed for complex curve-fitting problems where the relationship between input data and target variables is nonlinear. The package computes numerical gradients by default but allows users to supply an analytical gradient function for more complex models, improving performance and precision.
## Features
- **Flexible Model Definition**: Users can define their own model functions and gradients (optional).
- **Gradient Descent Optimization**: Automatically uses numerical gradients but can switch to analytical gradients when specified.
- **Data Scaling**: Optional data scaling for more stable optimization.
- **Customizable Learning Rate and Parameters**: Users can adjust the learning rate, decay factor, and other parameters to fine-tune the optimization process.
## Installation
To install the package, use the following command:
```bash
pip install nonlinear_curve_fitter
```
## How it works
The optimization process follows these steps:
### 1. Specify the Model Function:
The model function defines the relationship between your input data and the target output. An example model could be a combination of exponential and quadratic terms.
#### Example Model Structure:
$$f(x_1, x_2) = c_0 \cdot e^{x_1} + (1 + c_1 \cdot x_2^2)$$
```python
import numpy as np

def model_function(x, coefficients):
    x1, x2 = x[0], x[1]
    return coefficients[0] * np.exp(x1) + (1 + coefficients[1] * x2**2)
```
Here `x1` and `x2` are the input features, and `c0` and `c1` are the coefficients to be learned.
### 2. Optionally Specify the Gradient Function:
For more complex models, you may want to provide your own gradient function to improve optimization accuracy.
Example Gradient Terms:
To minimize the loss function using gradient descent, the gradients of the model with respect to the coefficients are calculated as follows:
The gradient with respect to (c0) is:
$$\frac{\partial f(x_1, x_2)}{\partial c_0} = e^{x_1}$$
The gradient with respect to (c1) is:
$$\frac{\partial f(x_1, x_2)}{\partial c_1} = x_2^2$$
```python
import numpy as np

def gradient_terms(x_data, coefficients):
    term1 = np.exp(x_data[0])  # df/dc0
    term2 = x_data[1]**2       # df/dc1
    return np.array([term1, term2])
```
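By default the package estimates gradients numerically; when you supply an analytical gradient function instead, it is worth sanity-checking it against a finite-difference estimate. The sketch below does this in plain Python for the example model above; the helper names (`model`, `analytic_grad`, `numerical_grad`) are illustrative and not part of the package's API:

```python
import math

def model(x, c):
    # example model from above: f = c0*exp(x1) + 1 + c1*x2^2
    return c[0] * math.exp(x[0]) + 1 + c[1] * x[1] ** 2

def analytic_grad(x, c):
    # [df/dc0, df/dc1]
    return [math.exp(x[0]), x[1] ** 2]

def numerical_grad(x, c, h=1e-6):
    # central finite differences with respect to each coefficient
    grads = []
    for i in range(len(c)):
        cp, cm = list(c), list(c)
        cp[i] += h
        cm[i] -= h
        grads.append((model(x, cp) - model(x, cm)) / (2 * h))
    return grads

x, c = [0.5, 2.0], [1.3, -0.7]
ag = analytic_grad(x, c)
ng = numerical_grad(x, c)
```

If the two disagree beyond finite-difference noise, the analytical gradient is likely wrong.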
### 3. Fit the Model:
Provide your data and call `fit()`; the model fits the curve to the data using gradient descent, adjusting the coefficients to minimize the error.
### 4. Predict New Values:
After training, use the `predict()` method to generate predictions based on new input data.
## Scaling Data
It is highly recommended to scale your data for better stability in the optimization process. Keep the following in mind:
- If you scale your data, you must apply the same scaling parameters to any future input data, ideally through the `predict()` method.
- If you do not scale your data, the coefficients will be easier to interpret, but the optimization may be less stable.
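As a sketch of the consistency requirement (illustrative pure-Python code, not the package's scaler): compute the scaling parameters once from the training inputs, then reuse exactly those parameters for any new inputs.

```python
def fit_minmax(rows):
    # per-feature minimum and range, computed once from the training data
    cols = list(zip(*rows))
    mins = [min(c) for c in cols]
    spans = [(max(c) - lo) if max(c) > lo else 1.0 for c, lo in zip(cols, mins)]
    return mins, spans

def apply_minmax(rows, mins, spans):
    # reuse the SAME parameters for any future inputs
    return [[(v - lo) / s for v, lo, s in zip(r, mins, spans)] for r in rows]

train = [[1.0, 10.0], [3.0, 30.0], [2.0, 20.0]]
mins, spans = fit_minmax(train)
scaled_train = apply_minmax(train, mins, spans)
scaled_new = apply_minmax([[2.0, 25.0]], mins, spans)
```

Scaling new inputs with freshly computed parameters instead of the stored ones would silently shift the model's inputs and invalidate its predictions.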
## Class and Methods
### `FunctionFitter`
The primary class for performing curve fitting using gradient descent.
#### Constructor
```python
FunctionFitter(model_func, learning_rate=1e-3, decay_factor=0, max_iterations=100000, user_gradients=None, error_tolerance=1e-5, gradient_tolerance=1e-5)
```
#### Parameters
- **model_func** (_callable_): The function defining the relationship between input data and target values. **(required)**
- **learning_rate** (_float_): The initial learning rate for the optimizer. Default is `1e-3`. **(optional)**
- **decay_factor** (_float_): Decay factor for the learning rate over iterations. Default is `0`. **(optional)**
- **max_iterations** (_int_): Maximum number of iterations to perform during optimization. Default is `100000`. **(optional)**
- **user_gradients** (_callable_): A user-defined function for calculating gradients. Default is `None` (numerical gradients will be used). **(optional)**
- **error_tolerance** (_float_): The threshold for convergence based on error reduction. Default is `1e-5`. **(optional)**
- **gradient_tolerance** (_float_): The threshold for convergence based on the gradient's norm. Default is `1e-5`. **(optional)**
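To illustrate how these parameters interact, here is a minimal pure-Python sketch of a gradient descent loop with a decayed learning rate and both stopping criteria, applied to the example model from earlier. This is an illustration under stated assumptions (in particular, the `1/(1 + decay_factor * iteration)` decay schedule is one common choice), not the package's actual implementation:

```python
import math
import random

def model(x, c):
    # example model: f = c0*exp(x1) + 1 + c1*x2^2
    return c[0] * math.exp(x[0]) + 1 + c[1] * x[1] ** 2

def grad_terms(x):
    # [df/dc0, df/dc1] for the model above
    return [math.exp(x[0]), x[1] ** 2]

def fit(xs, ys, learning_rate=1e-3, decay_factor=0.0, max_iterations=100000,
        error_tolerance=1e-5, gradient_tolerance=1e-5):
    c = [0.0, 0.0]
    prev_error = float("inf")
    error = prev_error
    for it in range(max_iterations):
        # mean-squared-error and its gradient over the whole dataset
        g = [0.0, 0.0]
        error = 0.0
        for x, y in zip(xs, ys):
            r = model(x, c) - y
            error += r * r / len(xs)
            t = grad_terms(x)
            g[0] += 2 * r * t[0] / len(xs)
            g[1] += 2 * r * t[1] / len(xs)
        lr = learning_rate / (1 + decay_factor * it)  # decayed step size
        c[0] -= lr * g[0]
        c[1] -= lr * g[1]
        # stop on small error improvement or small gradient norm
        gnorm = math.sqrt(g[0] ** 2 + g[1] ** 2)
        if abs(prev_error - error) < error_tolerance or gnorm < gradient_tolerance:
            break
        prev_error = error
    return c, error

# synthetic data generated from known coefficients c = [2.0, -0.5]
random.seed(0)
xs = [(random.uniform(0, 1), random.uniform(0, 1)) for _ in range(200)]
ys = [2.0 * math.exp(x1) + 1 - 0.5 * x2 ** 2 for x1, x2 in xs]
c, err = fit(xs, ys, learning_rate=0.05)
```

With a learning rate that is too large the loop diverges; with a very tight `error_tolerance` it runs longer but recovers the coefficients more precisely.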
---
### `fit(x_data, y_data)`
Fit the model to the input data using gradient descent.
- **x_data** (_numpy.ndarray_): Input data (features).
- **y_data** (_numpy.ndarray_): Target values.
---
### `predict(x_data)`
Generate predictions using the optimized model.
- **x_data** (_numpy.ndarray_): Input data (features).
---
### `get_coefficients()`
Return the optimized coefficients after fitting the model.
---
### `get_error()`
Return the final error (mean squared error) after the optimization process.
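As a reminder of the reported quantity, mean squared error is the average of the squared residuals (plain Python, not the package's code):

```python
def mean_squared_error(y_true, y_pred):
    # average of squared residuals
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

mse = mean_squared_error([1.0, 2.0, 3.0], [1.0, 2.5, 2.0])
```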
## Example Usage
1. Define your model function.
2. (Optional) Define your gradient function for complex models.
3. Fit the model to your data.
4. Use the trained model to make predictions.
For a detailed example, refer to the `gradient_optimizer_example.py` file in this repository.
## Recommendations
- **Data Scaling**: We highly recommend scaling your data for better optimization stability. However, if you scale, the coefficients must be applied to scaled inputs. The `predict()` method handles this if scaling is used.
- **Model Tuning**: Start with a basic set of parameters (learning rate, decay, etc.) and adjust based on model performance.
## License
This project is licensed under the MIT License.