explainable-attention

Name: explainable-attention
Version: 0.0.2
Summary: Implementation of various tools for multi-head attention explainability from transformers.
Author: David W. Ludwig II <davidludwigii@gmail.com>
Homepage: https://github.com/DLii-Research/explainable-attention
Requires-Python: >=3.8
Uploaded: 2025-08-10 18:11:12
# explainable-attention

Implementation of various tools for multi-head attention explainability from transformers.

## Self-Attention Attribution

[Hao, Yaru, et al. "Self-attention attribution: Interpreting information interactions inside transformer." Proceedings of the AAAI Conference on Artificial Intelligence. Vol. 35. No. 14. 2021.](https://arxiv.org/abs/2004.11207)
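The method attributes the model's output to individual attention heads via integrated gradients taken over the attention matrices. In the paper's formulation, for head $h$ with attention matrix $A_h$ and model output $F$ (notation lightly adapted here):

```latex
\mathrm{Attr}_h(A)
  = A_h \odot \int_{0}^{1} \frac{\partial F(\alpha A)}{\partial A_h}\, d\alpha
  \;\approx\; \frac{A_h}{m} \odot \sum_{k=1}^{m} \frac{\partial F\!\left(\tfrac{k}{m} A\right)}{\partial A_h}
```

The Riemann-sum approximation on the right is what the `integration_steps` argument controls: larger $m$ gives a closer approximation of the integral at the cost of more backward passes.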

```py
from explainable_attention import self_attention_attribution as saa

...

def objective(batch):
    x, y = batch
    y_pred = model(x)           # forward pass
    loss = loss_fn(y_pred, y)   # compare predictions against targets
    return loss

attribution = saa.compute(
    model.transformer_encoder.layers,
    objective,
    batch,
    integration_steps=20)
```
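The return format of `compute` isn't documented here; as a hypothetical sketch, suppose it yields an array of attribution scores of shape `(num_layers, num_heads, seq_len, seq_len)` (a fabricated stand-in is used below). Following the paper, a head's importance can then be summarized as its maximum attribution score over all token pairs:

```python
import numpy as np

# Hypothetical stand-in for the result of saa.compute(...):
# attribution scores with shape (num_layers, num_heads, seq_len, seq_len).
rng = np.random.default_rng(0)
attribution = rng.random((2, 4, 8, 8))

# Summarize each head by its maximum attribution over all token pairs.
head_importance = attribution.max(axis=(2, 3))  # shape: (num_layers, num_heads)

# Rank heads within each layer, most important first.
ranking = np.argsort(-head_importance, axis=1)

print(head_importance.shape)  # (2, 4)
```

Heads with low importance under this summary are natural candidates for pruning, which is one of the applications explored in the paper.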

            
