xai-feature-selection

Name: xai-feature-selection
Version: 0.6
Summary: Feature selection using XAI
Upload time: 2024-07-13 20:08:15
Author: Yaganteeswarudu Akkem
Keywords: machine learning, feature selection, explainable artificial intelligence, XAI
Requirements: no requirements were recorded.
            
**Advanced feature selection using explainable Artificial Intelligence (XAI)**

**Developed by Yaganteeswarudu Akkem, Data Scientist, Ph.D. Scholar, NIT Silchar**



**Introduction**

<p style='text-align: justify;'> 

In the rapidly evolving field of machine learning, the complexity of models is ever-increasing, necessitating sophisticated feature selection techniques to enhance predictive performance and shed light on the decision-making processes. This study presents an innovative architecture that synergizes the global explanation capabilities of SHAP (SHapley Additive exPlanations) with the local interpretability provided by LIME (Local Interpretable Model-agnostic Explanations) to advance the feature selection process. 

</p>

<p style='text-align: justify;'> 

Our proposed methodology harnesses the strengths of both SHAP and LIME, systematically identifying features that wield consistent influence across the entire dataset as well as those vital to individual predictions. By normalizing SHAP values to derive feature weights and integrating these with LIME scores, we formulate a maximum interpretation score for each feature. This hybrid framework offers a refined and nuanced approach to feature selection, adeptly balancing the pursuit of model simplicity with the demands for high predictive accuracy and interpretability. The architecture not only promises substantial enhancements in computational efficiency and model performance but also holds significant promise for applications where model transparency and decision-making understanding are critical.

</p>
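As an illustrative sketch only (the package's exact formula may differ, and the values below are made up), the idea of normalizing SHAP importances into weights and combining them with LIME scores into a per-feature "maximum interpretation score" can be expressed as:

```python
import numpy as np

# Hypothetical toy values; not output of the actual package
features = ["age", "income", "tenure"]
shap_importance = np.array([0.6, 0.3, 0.1])   # e.g. mean |SHAP| per feature (global)
lime_scores = np.array([0.40, 0.45, 0.15])    # e.g. aggregated local LIME weights

# Normalize SHAP values so the feature weights sum to 1
shap_weights = shap_importance / shap_importance.sum()

# "Maximum interpretation score": take the larger of the two signals per feature,
# so a feature important either globally or locally is retained
scores = np.maximum(shap_weights, lime_scores)

ranked = sorted(zip(features, scores), key=lambda pair: -pair[1])
print(ranked)
```

Features are then ranked by this combined score, and the top-ranked ones are kept.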

**Examples of How To Use Feature Selection**

Install the package with pip:

```
pip install xai-feature-selection==0.6
```





Import the package:

```
from xai_feature_selection.feature_selection import FeatureSelect
from xai_feature_selection.model_prediction import Model
```





Currently, xai_feature_selection is built to work on classification and regression problems.

Use the algorithms below to test regression:

 1. LinearRegression

 2. RandomForestRegressor

Use the algorithm below for classification:

 1. LogisticRegression



The syntax below retrieves the best features after calculating feature importance. The parameters are:

file_path: path to the CSV file on your system

predict_columns: the column to be predicted (the target of the classification or regression)

model_type_choice:



```
0 - Regression, 1 - Classification
```

model_choice  :



```
For regression:
0 - LinearRegression
1 - RandomForestRegressor

For classification:
0 - LogisticRegression
```
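For example, to run regression with RandomForestRegressor on a hypothetical CSV (the path and column name below are placeholders, not part of the package):

```python
file_path = "data/housing.csv"   # hypothetical CSV location
predict_columns = "price"        # hypothetical target column
model_type_choice = 0            # 0 - Regression
model_choice = 1                 # 1 - RandomForestRegressor
```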



Once all parameters are chosen, use the syntax below: Model trains and computes the LIME and SHAP values, and FeatureSelect then returns the important features.



```
if predict_columns and file_path:
    # Train the chosen model on the CSV data
    model = Model(
        model_type=model_type_choice,
        model_choice=model_choice,
        data_file_path=file_path,
        predict_columns=predict_columns,
    )
    model.train()

    # Compute the LIME and SHAP explanations
    lime_data, shap_data = model.explain()

    # Combine the explanations and select the best features
    feature_handler = FeatureSelect(shap_data=shap_data, lime_data=lime_data)
    feature_handler.prepare_weights()
    feature_handler.calculate_feature_values()
    print(feature_handler.get_best_feature_data())
```



### Note:

Preprocessing matters: the cleaner the data you pass in (no null values, outliers, and so on), the better the features the algorithm will return.
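A minimal preprocessing sketch with pandas (the columns and values are hypothetical; adapt the steps to your own data, then save the result and pass its path as file_path):

```python
import pandas as pd

# Toy frame standing in for your CSV
df = pd.DataFrame({"age": [25, 30, None, 41], "income": [40.0, 52.0, 48.0, 61.0]})

# Drop rows with null values
df = df.dropna()

# Trim gross outliers: keep rows within 3 standard deviations on numeric columns
num = df.select_dtypes("number")
df = df[((num - num.mean()).abs() <= 3 * num.std()).all(axis=1)]

print(len(df))  # save with df.to_csv(...) and pass that path as file_path
```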

