curvefit-gd


Name: curvefit-gd
Version: 1.3.0
Summary: A gradient descent optimizer that helps fit multivariate nonlinear curves to data
Upload time: 2024-09-08 22:07:35
Home page: None
Maintainer: None
Docs URL: None
Author: None
Requires Python: >=3.7
License: MIT License. Copyright (c) 2024 Behzad-amn. Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
Keywords: gradient descent, curve fitting, nonlinear, multivariate, machine learning, optimization
Requirements: No requirements were recorded.
# Multivariate Nonlinear Gradient Descent Curve Fitting

This package provides a multivariate nonlinear curve fitter that utilizes gradient descent for optimization based on mean square error. It is designed for solving complex curve-fitting problems where the relationship between input data and target variables is nonlinear. The package supports numerical gradient calculation by default but allows users to specify an analytical gradient function for more complex models to improve performance and precision.

## Features
- **Flexible Model Definition**: Users can define their own model functions and gradients (optional).
- **Gradient Descent Optimization**: Automatically uses numerical gradients but can switch to analytical gradients when specified.
- **Data Scaling**: Optional data scaling for more stable optimization.
- **Customizable Learning Rate and Parameters**: Users can adjust the learning rate, decay factor, and other parameters to fine-tune the optimization process.

## Installation
To install the package, use the following command:

```bash
pip install curvefit_gd
```

You can then import `FunctionFitter`:

```python
from curvefit_gd import FunctionFitter
```


## How it works

The optimization process follows these steps:

### 1. Specify the Model Function:
The model function defines the relationship between your input data and the target output. An example model could be a combination of exponential and quadratic terms.

#### Example Model Structure:

$$f(x_1, x_2) = c_0 \cdot e^{x_1} + (1 + c_1 \cdot x_2^2)$$
```python
import numpy as np

def model_function(x, coefficients):
    # x[0] and x[1] are the input features; coefficients holds c0 and c1.
    x1, x2 = x[0], x[1]
    return coefficients[0] * np.exp(x1) + (1 + coefficients[1] * x2**2)
```
Here x1 and x2 are the input features, and c0 and c1 are the coefficients to be learned.

### 2. Optionally Specify the Gradient Function:

For more complex models, you may want to provide your own gradient function to improve optimization accuracy.

#### Example Gradient Terms:


To minimize the loss function using gradient descent, the gradients of the model with respect to the coefficients are calculated as follows:

The gradient with respect to (c0) is:

$$\frac{\partial f(x_1, x_2)}{\partial c_0} = e^{x_1}$$

The gradient with respect to (c1) is:

$$\frac{\partial f(x_1, x_2)}{\partial c_1} = x_2^2$$

```python
def gradient_terms(x_data, coefficients):
    # Partial derivatives of the model with respect to each coefficient.
    term1 = np.exp(x_data[0])  # df/dc0 = exp(x1)
    term2 = x_data[1] ** 2     # df/dc1 = x2^2
    return np.array([term1, term2])
```
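
This gradient function is then supplied through the constructor's `user_gradients` parameter (documented below); when it is omitted, numerical gradients are used instead:

```python
# Use the analytical gradients instead of the default numerical ones.
fitter = FunctionFitter(model_function, user_gradients=gradient_terms)
```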

### 3. Fit the Model:
Provide your data and call `fit()`; the model is fitted to the data using gradient descent, adjusting the coefficients to minimize the error.

### 4. Predict New Values:
After training, use the `predict()` method to generate predictions for new input data.

## Scaling Data

Scaling your data is highly recommended for better stability during optimization. Keep the following in mind:

- If you scale your data, you must apply the same scaling parameters to any future input data, ideally through the `predict()` method; a sketch of this is shown below.
- If you choose not to scale your data, the coefficients are easier to interpret, but the optimization may be less stable.
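
As a minimal sketch of one way to do this by hand, the snippet below standardizes the training inputs and reuses the same statistics at prediction time. It assumes the `model_function` defined above and a features-by-samples data layout; the synthetic data and the manual standardization are illustrative and not part of the package API.

```python
import numpy as np
from curvefit_gd import FunctionFitter

# Illustrative training data: rows are features, columns are samples.
rng = np.random.default_rng(0)
x_train = rng.uniform(0.0, 1.0, size=(2, 100))
y_train = model_function(x_train, [1.5, -0.8])  # hypothetical "true" coefficients

# Standardize using statistics computed on the training inputs only...
x_mean = x_train.mean(axis=1, keepdims=True)
x_std = x_train.std(axis=1, keepdims=True)

fitter = FunctionFitter(model_function)
fitter.fit((x_train - x_mean) / x_std, y_train)

# ...and reuse the *same* mean and std for any future inputs.
x_new = rng.uniform(0.0, 1.0, size=(2, 5))
y_pred = fitter.predict((x_new - x_mean) / x_std)
```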


## Class and Methods

### `FunctionFitter`
  
The primary class for performing curve fitting using gradient descent.

#### Constructor

```python
FunctionFitter(model_func, learning_rate=1e-3, decay_factor=0, max_iterations=100000,
               user_gradients=None, error_tolerance=1e-5, gradient_tolerance=1e-5)
```

#### Parameters

- **model_func** (_callable_): The function defining the relationship between input data and target values. **(required)**
  
- **learning_rate** (_float_): The initial learning rate for the optimizer. Default is `1e-3`. **(optional)**

- **decay_factor** (_float_): Decay factor for the learning rate over iterations. Default is `0`. **(optional)**

- **max_iterations** (_int_): Maximum number of iterations to perform during optimization. Default is `100,000`. **(optional)**

- **user_gradients** (_callable_): A user-defined function for calculating gradients. Default is `None` (numerical gradients will be used). **(optional)**

- **error_tolerance** (_float_): The threshold for convergence based on error reduction. Default is `1e-5`. **(optional)**

- **gradient_tolerance** (_float_): The threshold for convergence based on the gradient's norm. Default is `1e-5`. **(optional)**
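
For instance, a fitter for the example model above with a larger learning rate, a small decay factor, and the analytical gradients could be configured as follows; the specific values are illustrative, not recommendations:

```python
fitter = FunctionFitter(
    model_function,                 # required model definition
    learning_rate=5e-3,             # larger initial step size (illustrative)
    decay_factor=1e-4,              # gently shrink the step size over iterations
    max_iterations=50000,
    user_gradients=gradient_terms,  # analytical gradients from the example above
)
```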

---

### `fit(x_data, y_data)`

Fit the model to the input data using gradient descent.

- **x_data** (_numpy.ndarray_): Input data (features).

- **y_data** (_numpy.ndarray_): Target values.

---

### `predict(x_data)`

Generate predictions using the optimized model.

- **x_data** (_numpy.ndarray_): Input data (features).

---

### `get_coefficients()`

Return the optimized coefficients after fitting the model.

---

### `get_error()`

Return the final error (mean squared error) after the optimization process.


## Example Usage

1. Define your model function.
2. (Optional) Define your gradient function for complex models.
3. Fit the model to your data.
4. Use the trained model to make predictions.

For a detailed example, refer to the `gradient_optimizer_example.py` file in this repository.
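
As a compact illustration of these steps, here is an end-to-end sketch using the example model from above on synthetic data. The data, the coefficient values, the features-by-samples layout, and the assumption that the fitter determines the number of coefficients internally are illustrative; see the bundled example file for the authoritative usage.

```python
import numpy as np
from curvefit_gd import FunctionFitter

# 1. Model: f(x1, x2) = c0 * exp(x1) + (1 + c1 * x2**2)
def model_function(x, coefficients):
    return coefficients[0] * np.exp(x[0]) + (1 + coefficients[1] * x[1] ** 2)

# 2. Optional analytical gradient terms (df/dc0, df/dc1).
def gradient_terms(x, coefficients):
    return np.array([np.exp(x[0]), x[1] ** 2])

# Synthetic data (illustrative): rows are features, columns are samples.
rng = np.random.default_rng(42)
x_data = rng.uniform(0.0, 1.0, size=(2, 200))
y_data = model_function(x_data, [1.5, -0.8]) + rng.normal(0.0, 0.01, size=200)

# 3. Fit the coefficients with gradient descent.
fitter = FunctionFitter(model_function, learning_rate=1e-3, user_gradients=gradient_terms)
fitter.fit(x_data, y_data)
print("coefficients:", fitter.get_coefficients())
print("final MSE:", fitter.get_error())

# 4. Predict for new inputs.
x_new = rng.uniform(0.0, 1.0, size=(2, 5))
print("predictions:", fitter.predict(x_new))
```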


## Recommendations

- **Data Scaling**: We highly recommend scaling your data for better optimization stability. However, if you scale, the coefficients must be applied to scaled inputs. The `predict()` method handles this if scaling is used.

- **Model Tuning**: Start with a basic set of parameters (learning rate, decay, etc.) and adjust based on model performance.

## License

This project is licensed under the MIT License.


            
