# Quantum-XAI: Explainable Quantum Machine Learning Library
Quantum-XAI is a comprehensive, production-ready toolkit for explainable quantum machine learning. It provides a suite of quantum-native explainability methods, quantum neural network models, advanced visualizations, benchmarking tools, and research-grade features to interpret and analyze quantum neural network decisions.
---
## Features
- **Quantum Neural Network Models**
- Variational Quantum Classifier (VQC) implementation using PennyLane
- Supports multiple quantum data encodings: Angle, Amplitude, and IQP encoding
- **Explainability Methods**
- Quantum SHAP Explainer: SHAP-like sampling-based explanations
- Quantum Gradient Explainer: Gradient-based explanations using the parameter-shift rule
- Quantum LIME Explainer: LIME-like local surrogate model explanations
- Quantum Perturbation Explainer: Explanations via feature occlusion
- **Visualization Tools**
- Feature importance bar charts
- Side-by-side explanation method comparisons
- Quantum circuit diagrams with explanation overlays
- Radar charts for quantum feature importance
- **Datasets & Utilities**
- Preprocessed quantum-ready datasets: Iris, Wine, Breast Cancer
- Dataset loaders with normalization and binary classification options
- **Benchmarking & Evaluation**
- Compare multiple explainers on test samples
- Compute explanation consistency and quality metrics
- Faithfulness, sparsity, stability, and top feature importance analysis
- **Research Extensions**
- Quantum Fisher Information matrix computation
- Quantum entanglement contribution analysis
- Quantum feature interaction analysis beyond classical correlations
- **Save/Load Functionality**
- Save trained models and explanations to JSON
- Load models and explanations from JSON for reproducibility
- **Complete Demo**
- End-to-end demonstration of training, explaining, visualizing, benchmarking, and reporting
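To illustrate the occlusion idea behind the perturbation explainer, here is a minimal NumPy sketch (not the library's implementation): each feature is scored by how much the model's prediction changes when that feature is replaced with a baseline value. The toy linear model below is a stand-in for a trained quantum classifier.

```python
import numpy as np

def occlusion_importance(predict, x, baseline=0.0):
    """Score each feature by the prediction change when that
    feature is replaced with a baseline value."""
    base_pred = predict(x)
    scores = np.empty(len(x))
    for i in range(len(x)):
        x_occluded = x.copy()
        x_occluded[i] = baseline  # occlude one feature
        scores[i] = base_pred - predict(x_occluded)
    return scores

# Toy stand-in model: a linear scorer whose weights the method recovers
weights = np.array([0.5, -1.0, 2.0, 0.0])
predict = lambda x: float(weights @ x)

print(occlusion_importance(predict, np.array([1.0, 1.0, 1.0, 1.0])))
```

For a linear model with a zero baseline, the scores reduce exactly to the weights, which makes this a handy sanity check for any occlusion-style explainer.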
---
## Installation
Ensure you have Python 3.7+ installed, then install the required dependencies:
```bash
pip install pennylane numpy matplotlib seaborn scikit-learn pandas
```
---
## Usage
### Quick Start Demo
Run the complete demonstration with the Iris dataset:
```python
from quantum_xai import QuantumXAIDemo
demo = QuantumXAIDemo()
results = demo.run_complete_demo(dataset='iris', n_samples=80)
```
### Custom Model Training and Explanation
```python
from quantum_xai import (
    QuantumNeuralNetwork,
    QuantumSHAPExplainer,
    QuantumGradientExplainer,
    QuantumXAIVisualizer,
    QuantumDatasetLoader,
)
from sklearn.model_selection import train_test_split
# Load data
X, y, feature_names = QuantumDatasetLoader.load_iris_quantum(n_samples=100)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=42)
# Create and train model
model = QuantumNeuralNetwork(n_features=X.shape[1], n_qubits=4, n_layers=2)
model.train(X_train, y_train, epochs=100, lr=0.1)
# Create explainers
shap_explainer = QuantumSHAPExplainer(model, X_train)
gradient_explainer = QuantumGradientExplainer(model)
# Generate an explanation for the first test sample
explanation = shap_explainer.explain(X_test, 0)
# Visualize explanation
visualizer = QuantumXAIVisualizer()
fig = visualizer.plot_feature_importance(explanation, feature_names)
fig.show()
```
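The parameter-shift rule that the gradient explainer relies on can be shown in isolation, independent of any quantum backend. For expectation values with sinusoidal parameter dependence (single-qubit rotation gates), the derivative is exact: `f'(theta) = (f(theta + pi/2) - f(theta - pi/2)) / 2`. A minimal NumPy sketch:

```python
import numpy as np

def parameter_shift_grad(f, theta, shift=np.pi / 2):
    """Exact gradient of an expectation value with sinusoidal
    dependence on theta (e.g. single-qubit rotation gates)."""
    return (f(theta + shift) - f(theta - shift)) / 2.0

# Expectation of Pauli-Z after RY(theta) on |0>: <Z> = cos(theta),
# so the true gradient is -sin(theta).
theta = 0.7
grad = parameter_shift_grad(np.cos, theta)
print(grad, -np.sin(theta))  # the two values agree to machine precision
```

Unlike finite differences, the shift here is large (pi/2) and the result is exact rather than approximate, which is what makes the rule well suited to noisy quantum hardware.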
---
## Research Applications
- Benchmark quantum vs classical explainability methods
- Analyze quantum Fisher information and entanglement effects
- Extend to other quantum platforms (Qiskit, Cirq)
- Develop advanced quantum-specific explanation metrics
- Apply to real quantum datasets in chemistry, finance, and more
---
## Project Structure
- `QuantumNeuralNetwork`: Variational quantum classifier model
- `QuantumExplainer` and subclasses: Explainability methods (SHAP, Gradient, LIME, Perturbation)
- `QuantumXAIVisualizer`: Visualization utilities
- `QuantumDatasetLoader`: Dataset loading and preprocessing
- `QuantumXAIBenchmark`: Benchmarking and evaluation tools
- `QuantumXAIDemo`: Complete demo and example workflows
- `save_model_and_explanations` / `load_model_and_explanations`: Persistence utilities
- `QuantumXAIResearch`: Advanced research features
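A persistence round trip along the lines of `save_model_and_explanations` / `load_model_and_explanations` can be sketched with plain JSON (the payload keys below are hypothetical; the actual on-disk schema is defined by the library):

```python
import json

# Hypothetical payload: variational parameters plus one explanation record
state = {
    "weights": [[0.12, -0.83], [1.05, 0.44]],
    "explanations": [
        {"sample_index": 0, "importances": [0.31, -0.07, 0.52, 0.10]}
    ],
}

with open("quantum_xai_state.json", "w") as fh:
    json.dump(state, fh)

with open("quantum_xai_state.json") as fh:
    restored = json.load(fh)

assert restored == state  # round trip is lossless for JSON-serializable data
```

Storing both the trained parameters and the explanations they produced in one file is what makes a published result reproducible from the artifact alone.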
---
## License
This project is open-source and available for research, publication, and industry use.
---
## Contact
For questions, contributions, or collaborations, please open an issue or pull request on the GitHub repository.
---
## Acknowledgments
This library builds upon PennyLane and scikit-learn, leveraging quantum computing and classical ML explainability techniques.