jss-optimizer

- Name: jss-optimizer
- Version: 0.1.2
- Author: Jegadit S Saravanan
- Upload time: 2024-04-07 06:58:12
- Keywords: machine-learning, parameter optimization, genetic algorithm, simulated annealing
### Based on the paper: J. S. Saravanan and A. Mahadevan, "AI based parameter estimation of ML model using Hybrid of Genetic Algorithm and Simulated Annealing," 2023 14th International Conference on Computing Communication and Networking Technologies (ICCCNT), Delhi, India, 2023, pp. 1-5, doi: 10.1109/ICCCNT56998.2023.10308077. 

# Hyperparameter Optimization with Genetic Algorithm and Simulated Annealing

This repository contains a Python package, `jss_optimizer`, for optimizing hyperparameters with a hybrid of a genetic algorithm (GA) and simulated annealing (SA).
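The hybrid idea can be illustrated with a minimal, self-contained sketch. This is not the package's implementation; the toy objective, bounds, and every function name below are illustrative assumptions. The GA explores the search space globally, then SA refines the GA's best candidate locally, which is what lets the hybrid escape a GA local optimum:

```python
import math
import random

# Toy objective (lower is better) standing in for a model's validation error.
def objective(x):
    return (x[0] - 3) ** 2 + (x[1] - 7) ** 2

BOUNDS = [(0, 10), (0, 10)]  # integer range for each "hyperparameter"

def random_individual():
    return [random.randint(lo, hi) for lo, hi in BOUNDS]

def genetic_search(pop_size=20, generations=30):
    """Global search: selection, uniform crossover, random-reset mutation."""
    pop = [random_individual() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=objective)
        survivors = pop[: pop_size // 2]          # keep the fitter half
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = random.sample(survivors, 2)
            child = [random.choice(pair) for pair in zip(a, b)]  # crossover
            if random.random() < 0.3:                            # mutation
                i = random.randrange(len(child))
                lo, hi = BOUNDS[i]
                child[i] = random.randint(lo, hi)
            children.append(child)
        pop = survivors + children
    return min(pop, key=objective)

def simulated_annealing(start, temp=5.0, cooling=0.95, steps=200):
    """Local refinement: accept worse neighbors with probability exp(-delta/T)."""
    current, best = start[:], start[:]
    for _ in range(steps):
        i = random.randrange(len(current))
        lo, hi = BOUNDS[i]
        neighbor = current[:]
        neighbor[i] = max(lo, min(hi, neighbor[i] + random.choice([-1, 1])))
        delta = objective(neighbor) - objective(current)
        if delta < 0 or random.random() < math.exp(-delta / temp):
            current = neighbor
        if objective(current) < objective(best):
            best = current[:]
        temp *= cooling  # cool down: accept fewer uphill moves over time
    return best

random.seed(42)
ga_best = genetic_search()
hybrid_best = simulated_annealing(ga_best)  # SA starts from the GA result
print('GA best:', ga_best, '-> hybrid best:', hybrid_best)
```

Since SA starts from the GA's answer and tracks the best point it visits, the hybrid result is never worse than the GA result on the objective.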

## Installation

You can install the package using pip:

```bash
pip install jss_optimizer
```

## Usage Example

```python
from jss_optimizer.jss_optimizer import HyperparameterOptimizer
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.ensemble import RandomForestClassifier

# Load dataset
df = pd.read_csv('dataset/heart_v2.csv')
X = df.drop('heart disease', axis=1)
y = df['heart disease']
X_train, X_test, y_train, y_test = train_test_split(X, y, train_size=0.7, random_state=42)

# Define the model and the hyperparameters to tune
# (the optimizer receives the model class itself, not an instance)
model = RandomForestClassifier
params = ['max_depth', 'min_samples_leaf', 'n_estimators']

# Create an instance of HyperparameterOptimizer
optimizer = HyperparameterOptimizer(model, params)

# Optimize hyperparameters using genetic algorithm
best_solution_genetic = optimizer.optimize(X_train, y_train, X_test, y_test)
print('Best solution found by genetic algorithm:', best_solution_genetic)

# NOTE
# The genetic algorithm alone often finds a good solution, but it can get
# stuck in a local optimum. To avoid that, further refinement with
# simulated annealing is recommended. Use whichever solution works best.

# Perform simulated annealing 
best_solution_simulated_annealing = optimizer.simulate_annealing(best_solution_genetic, X_train, y_train, X_test, y_test)
print('Best solution found by GA-SA hybrid optimization algorithm:', best_solution_simulated_annealing)

```
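Once a solution is found, it can be turned back into a trained model. The package does not document the shape of the returned solution, so the sketch below assumes it is a sequence of values in the same order as the `params` list; the `best_solution` values and the synthetic dataset (standing in for `heart_v2.csv`) are hypothetical:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Synthetic stand-in for the heart-disease dataset used above.
X, y = make_classification(n_samples=300, n_features=8, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, train_size=0.7, random_state=42)

# Hypothetical optimizer output, assumed to follow the order of `params`.
params = ['max_depth', 'min_samples_leaf', 'n_estimators']
best_solution = [6, 2, 150]
best_params = dict(zip(params, best_solution))

# Train a final model with the tuned hyperparameters and score it.
final_model = RandomForestClassifier(**best_params, random_state=42)
final_model.fit(X_train, y_train)
score = final_model.score(X_test, y_test)
print(f'Test accuracy with tuned hyperparameters: {score:.3f}')
```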

## Work is in progress to extend the package to any dataset and model

## Version Logs
### version: 0.1.1 - Works only with RandomForestClassifier, on any dataset.
### version: 0.1.2 - Same as 0.1.1, with improvements and support for train-test splitting with proper score metrics.

            
