monotonic-attention

Name: monotonic-attention
Version: 0.0.4
Summary: Monotonic attention implementation
Home page: https://github.com/codekansas/monotonic-attention
Author: Benjamin Bolte
Requires Python: >=3.10
Upload time: 2023-11-12 20:35:40

# monotonic-attention

Monotonic attention as a probabilistic graphical model

[Write-up explaining how this works](https://ben.bolte.cc/monotonic-attention)

Check out the `examples/` directory for more information.
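
For intuition, the sketch below implements the classic soft monotonic-alignment recurrence from Raffel et al. (2017), which computes the expected query-to-key alignment under a monotonicity constraint. This is illustrative only and is not taken from this package; the probabilistic-graphical-model formulation used here is described in the write-up above and may differ in its details.

```python
import torch


def expected_monotonic_alignment(p: torch.Tensor) -> torch.Tensor:
    """Soft monotonic alignment recurrence (Raffel et al., 2017), for intuition.

    p: (num_queries, num_keys) per-step selection probabilities in (0, 1).
    Returns alpha: (num_queries, num_keys) expected alignment weights.
    """
    num_queries, num_keys = p.shape
    alpha_prev = torch.zeros(num_keys)
    alpha_prev[0] = 1.0  # the alignment pointer starts at the first key
    rows = []
    for i in range(num_queries):
        alpha_i = torch.zeros(num_keys)
        q = torch.zeros(())
        for j in range(num_keys):
            # Mass arriving at key j: either the previous query was already
            # aligned to key j, or this query skipped past key j - 1.
            if j == 0:
                q = alpha_prev[0]
            else:
                q = alpha_prev[j] + (1.0 - p[i, j - 1]) * q
            alpha_i[j] = p[i, j] * q  # select key j with probability p[i, j]
        rows.append(alpha_i)
        alpha_prev = alpha_i
    return torch.stack(rows)


alpha = expected_monotonic_alignment(torch.rand(4, 6))
print(alpha)
```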

## Getting Started

Install from PyPI:

```bash
pip install monotonic-attention
```

Install from source:

```bash
pip install git+https://github.com/codekansas/monotonic-attention.git
```

You should also install Triton if you plan to use the GPU kernels (highly recommended):

```bash
pip install triton
```
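
To confirm the GPU path is usable, a quick sanity check like the one below should suffice. It only uses standard `torch` and `triton` calls and nothing from this package:

```python
# Optional sanity check that Triton and a CUDA device are both available
# before relying on the GPU kernels.
import torch
import triton

print("triton version:", triton.__version__)
print("cuda available:", torch.cuda.is_available())
```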

## Usage

```python
from monotonic_attention import OneToManyMultiheadMonotonicAttention

# Many keys mapped to a single query.
attn = OneToManyMultiheadMonotonicAttention(
  mode="many_keys_one_query",
  embed_dim=1024,
  num_heads=16,
)

output = attn(query, key, value)

# Many queries mapped to a single key.
attn = OneToManyMultiheadMonotonicAttention(
  mode="many_queries_one_key",
  embed_dim=1024,
  num_heads=16,
)

output = attn(query, key, value)
```
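
The `query`, `key`, and `value` tensors above are left undefined. Below is a minimal end-to-end sketch that assumes the module takes batch-first `(batch, sequence, embed_dim)` inputs; the shape convention is an assumption here, so check the `examples/` directory for the canonical usage:

```python
import torch

from monotonic_attention import OneToManyMultiheadMonotonicAttention

attn = OneToManyMultiheadMonotonicAttention(
  mode="many_keys_one_query",
  embed_dim=1024,
  num_heads=16,
)

# Assumed batch-first shapes: (batch, num_queries, embed_dim) for the query,
# (batch, num_keys, embed_dim) for the keys and values.
query = torch.randn(2, 8, 1024)
key = torch.randn(2, 32, 1024)
value = torch.randn(2, 32, 1024)

output = attn(query, key, value)
print(output.shape)
```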

            
