AlgoMaster

Name: AlgoMaster
Version: 0.1.2
Summary: The Regression class simplifies regression analysis by providing a convenient and flexible approach to model training, evaluation, and hyperparameter tuning. The Classifier class streamlines classification tasks by offering a well-organized framework for model selection and hyperparameter tuning.
Author: sajo sam
Upload time: 2023-06-30 11:50:54
Keywords: machine learning, classifiers, logistic regression, k-nearest neighbors, naive Bayes, random forests, support vector machines, ensemble methods, hyperparameter tuning, performance evaluation, comparison of multiple classifiers, regression analysis, Python, model training, evaluation, regression algorithms, metrics, regression models, accurate predictions
            
# AlgoMaster



Simplifying Regression and Classification Modeling



## Guide



### Installation setup



`pip install AlgoMaster`
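
To check that the install worked, the package can be imported directly. The snippet below is a minimal smoke test; it assumes `AlgoMaster` exposes its `Classifier` and `Regressor` classes at the package top level, as the examples in this guide suggest.

    # Post-install smoke test. Assumes Classifier and Regressor are exposed
    # at the top level of the AlgoMaster package, as used in the guide below.
    import AlgoMaster

    print(AlgoMaster.Classifier)   # should print the Classifier class object
    print(AlgoMaster.Regressor)    # should print the Regressor class object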



### Classification model



1.  Initialize the model with your feature matrix `X` and target `Y` (a complete worked sketch follows this list)



        `Classifier = AlgoMaster.Classifier(X, Y, test_size=0.2, random_state=20)`



2.  Train all models and report their results in a table



        `Classifier.model_training()`



3.  Ensemble technique

    To combine the predictions of several models, pass the number of models to use:

        `Classifier.ensemble_prediction(n)`



4.  Single Training

    To predict unseen data with an individual model, pass one feature vector:



        data = [1, 2, 3, 4, 5, 6, 7, 8, 9]
        Classifier.logistic_test(data)
        Classifier.KNeighbors_test(data)
        Classifier.GaussianNB_test(data)
        Classifier.Bagging_test(data)
        Classifier.ExtraTrees_test(data)
        Classifier.RandomForest_test(data)
        Classifier.DecisionTree_test(data)
        Classifier.AdaBoost_test(data)
        Classifier.GradientBoosting_test(data)
        Classifier.XGBoost_test(data)
        Classifier.SGD_test(data)
        Classifier.SVC_test(data)
        Classifier.Ridge_test(data)
        Classifier.BernoulliNB_test(data)



5.  Hyperparameter Tuning

    To find the best parameters for every model



        `Classifier.hyperparameter_tuning()`



6.  Single Hyperparameter Tuning

    To find the best parameters for an individual model



        Classifier.logistic_hyperparameter()
        Classifier.KNeighbors_hyperparameter()
        Classifier.GaussianNB_hyperparameter()
        Classifier.Bagging_hyperparameter()
        Classifier.ExtraTrees_hyperparameter()
        Classifier.RandomForest_hyperparameter()
        Classifier.DecisionTree_hyperparameter()
        Classifier.AdaBoost_hyperparameter()
        Classifier.GradientBoosting_hyperparameter()
        Classifier.XGBoost_hyperparameter()
        Classifier.SGD_hyperparameter()
        Classifier.SVC_hyperparameter()
        Classifier.Ridge_hyperparameter()
        Classifier.BernoulliNB_hyperparameter()
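
Putting the classification steps together, a complete run might look like the sketch below. This is illustrative, not part of the package documentation: the breast-cancer dataset from scikit-learn is only an example data source, the assumption that the class accepts NumPy arrays is unverified, and the instance is named `clf` to avoid shadowing the `Classifier` class.

    # Illustrative end-to-end classification workflow (assumptions: array-like
    # X/Y are accepted, and scikit-learn is available for the example dataset).
    import AlgoMaster
    from sklearn.datasets import load_breast_cancer   # example data only

    X, Y = load_breast_cancer(return_X_y=True)         # X: (n_samples, n_features)
    clf = AlgoMaster.Classifier(X, Y, test_size=0.2, random_state=20)

    clf.model_training()             # train every classifier and tabulate the results
    clf.ensemble_prediction(3)       # combine 3 models; the count here is arbitrary
    clf.hyperparameter_tuning()      # search for the best parameters across models

    sample = list(X[0])              # a single feature vector standing in for unseen data
    clf.logistic_test(sample)        # prediction from one specific model
    clf.logistic_hyperparameter()    # tune just the logistic regression model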



### Regression model



1.  Initialize the model with your feature matrix `X` and target `Y` (a complete worked sketch follows this list)



        `Regressor = AlgoMaster.Regressor(X, Y, test_size=0.2, random_state=20)`



2.  Train all models and report their results in a table



        `Regressor.model_training()`



3.  Ensemble technique

    To combine the predictions of several models, pass the number of models to use:

        `Regressor.ensemble_prediction(n)`



4.  Single Training

    To predict unseen data with an individual model, pass one feature vector:



        data = [1, 2, 3, 4, 5, 6, 7, 8, 9]
        Regressor.LinearRegression_test(data)
        Regressor.KNeighbors_test(data)
        Regressor.Bagging_test(data)
        Regressor.ExtraTrees_test(data)
        Regressor.RandomForest_test(data)
        Regressor.DecisionTree_test(data)
        Regressor.AdaBoost_test(data)
        Regressor.GradientBoosting_test(data)
        Regressor.XGBoost_test(data)
        Regressor.TheilSen_test(data)
        Regressor.SVR_test(data)
        Regressor.Ridge_test(data)
        Regressor.RANSAC_test(data)
        Regressor.ARD_test(data)
        Regressor.BayesianRidge_test(data)
        Regressor.HuberRegressor_test(data)
        Regressor.Lasso_test(data)
        Regressor.ElasticNet_test(data)



5.  Hyperparameter Tuning

    To find the best parameters for every model



        `Regressor.hyperparameter_tuning()`



6.  Single Hyperparameter Tuning

    To find the best parameters for an individual model



        Regressor.KNeighbors_hyperparameter()
        Regressor.Bagging_hyperparameter()
        Regressor.ExtraTrees_hyperparameter()
        Regressor.RandomForest_hyperparameter()
        Regressor.DecisionTree_hyperparameter()
        Regressor.AdaBoost_hyperparameter()
        Regressor.GradientBoosting_hyperparameter()
        Regressor.XGBoost_hyperparameter()
        Regressor.TheilSen_hyperparameter()
        # Regressor.SVR_hyperparameter()
        Regressor.Ridge_hyperparameter()
        Regressor.RANSAC_hyperparameter()
        Regressor.ARD_hyperparameter()
        Regressor.BayesianRidge_hyperparameter()
        Regressor.Lasso_hyperparameter()
        Regressor.ElasticNet_hyperparameter()
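
The regression workflow mirrors the classification one. The sketch below is again illustrative rather than authoritative: the diabetes dataset from scikit-learn is only a stand-in, and the same unverified assumption about array-like inputs applies.

    # Illustrative end-to-end regression workflow (same assumptions as the
    # classification sketch above: array-like X/Y and scikit-learn available).
    import AlgoMaster
    from sklearn.datasets import load_diabetes   # example data only

    X, Y = load_diabetes(return_X_y=True)
    reg = AlgoMaster.Regressor(X, Y, test_size=0.2, random_state=20)

    reg.model_training()                # train every regressor and tabulate the results
    reg.ensemble_prediction(3)          # combine 3 models; the count here is arbitrary
    reg.hyperparameter_tuning()         # search for the best parameters across models

    sample = list(X[0])                 # a single feature vector standing in for unseen data
    reg.LinearRegression_test(sample)   # prediction from one specific model
    reg.Ridge_hyperparameter()          # tune just the ridge regressor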


            
