# alphagenome-pytorch

- Version: 0.0.40
- Summary: AlphaGenome
- Author: Phil Wang &lt;lucidrains@gmail.com&gt;
- Repository: https://github.com/lucidrains/alphagenome
- Requires Python: >=3.9
- License: MIT (Copyright (c) 2025 Phil Wang)
- Upload time: 2025-07-08 19:32:08
- Keywords: artificial intelligence, attention mechanism, deep learning, genomics, splicing, transformers
<img src="./extended-figure-1.png" width="450px" />

## AlphaGenome (wip)

Implementation of [AlphaGenome](https://deepmind.google/discover/blog/alphagenome-ai-for-better-understanding-the-genome/), DeepMind's updated genomic attention model


## Appreciation

- [Miquel Anglada-Girotto](https://github.com/MiqG) for contributing the organism, output embedding, loss functions, and all the splicing prediction heads!

## Install

```bash
$ pip install alphagenome-pytorch
```

## Usage

The main U-net transformer, without any heads:

```python
import torch
from alphagenome_pytorch import AlphaGenome

model = AlphaGenome()

dna = torch.randint(0, 5, (2, 8192))

# organism_index - 0 for human, 1 for mouse - can be changed with `num_organisms` on `AlphaGenome`

embeds_1bp, embeds_128bp, embeds_pair = model(dna, organism_index = 0) # (2, 8192, 1536), (2, 64, 3072), (2, 4, 4, 128)
```
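The `torch.randint(0, 5, ...)` call above stands in for tokenized DNA. A minimal sketch of turning a nucleotide string into token indices — note that the `A/C/G/T/N → 0..4` mapping here is an assumption for illustration; check the repository for the vocabulary the model actually expects:

```python
# Hypothetical base-to-index mapping (0..4); the package's actual
# vocabulary and ordering may differ -- verify against the source.
NUCLEOTIDE_TO_INDEX = {'A': 0, 'C': 1, 'G': 2, 'T': 3, 'N': 4}

def tokenize_dna(sequence):
    """Map a DNA string to a list of integer tokens."""
    return [NUCLEOTIDE_TO_INDEX[base] for base in sequence.upper()]

tokens = tokenize_dna('acgtn')
print(tokens)  # [0, 1, 2, 3, 4]
```

Wrap the result with `torch.tensor(tokens).unsqueeze(0)` to get the `(batch, seq_len)` LongTensor the model consumes.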

Adding all types of output heads (thanks to [@MiqG](https://github.com/MiqG)):

```python
import torch
from alphagenome_pytorch import AlphaGenome, publication_heads_config

model = AlphaGenome()

model.add_heads(
    'human',
    num_tracks_1bp = 10,
    num_tracks_128bp = 10,
    num_tracks_contacts = 128,
    num_splicing_contexts = 64, # 2 strands x num. CURIE conditions
)

dna = torch.randint(0, 5, (2, 8192))

organism_index = torch.tensor([0, 1]) # the organism that each sequence belongs to
splice_donor_idx = torch.tensor([[10, 100, 34], [24, 546, 870]])
splice_acceptor_idx = torch.tensor([[15, 103, 87], [56, 653, 900]])

# get sequence embeddings

embeddings_1bp, embeddings_128bp, embeddings_pair = model(dna, organism_index, return_embeds = True) # (2, 8192, 1536), (2, 64, 3072), (2, 4, 4, 128)

# get track predictions

out = model(
    dna,
    organism_index,
    splice_donor_idx = splice_donor_idx,
    splice_acceptor_idx = splice_acceptor_idx
)

for organism, outputs in out.items():
    for out_head, out_values in outputs.items():
        print(organism, out_head, out_values.shape)

# human 1bp_tracks torch.Size([2, 8192, 10])
# human 128bp_tracks torch.Size([2, 64, 10])
# human contact_head torch.Size([2, 4, 4, 128])
# human splice_logits torch.Size([2, 8192, 5])
# human splice_usage torch.Size([2, 8192, 64])
# human splice_juncs torch.Size([2, 3, 3, 64])

# initialize published AlphaGenome for human and mouse
model = AlphaGenome()
model.add_heads(**publication_heads_config['human'])
model.add_heads(**publication_heads_config['mouse'])
model.total_parameters # 259,459,534 (vs the published model's ~450 million trainable parameters)
```

## Training

### Test the minimal architecture
```shell
# loss quickly decreases and stabilizes around 1349651
# this minimal model (576,444 parameters) can be run on a CPU

python train_dummy.py --config_file=configs/dummy.yaml
```
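Genomic track models in this family are commonly trained with Poisson-style losses on per-position read counts; the loss functions actually used in this repo may differ, but the per-position negative log-likelihood term (dropping the rate-independent constant) can be sketched as:

```python
import math

def poisson_nll(rate, target):
    """Per-position Poisson negative log-likelihood, up to a constant.

    rate: predicted (positive) mean count; target: observed count.
    Derived from -log P(target | rate) = rate - target*log(rate) + log(target!),
    with the rate-independent log(target!) term dropped.
    """
    return rate - target * math.log(rate)

# A prediction matching the observed count scores better than one far off.
print(poisson_nll(5.0, 5.0) < poisson_nll(1.0, 5.0))  # True
```

Minimizing this term over `rate` for a fixed `target` recovers `rate == target`, which is why it is a natural fit for count-valued genomic tracks.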

## Contributing

First, install locally:

```bash
$ pip install '.[test]' # or: uv pip install '.[test]'
```

Then make your changes, add a test to `tests/test_alphagenome.py`, and run the test suite:

```bash
$ pytest tests
```

That's it!

Vibe coding with some attention network is totally welcome, if it works.

## Citations

```bibtex
@article {avsec2025alphagenome,
    title = {AlphaGenome: advancing regulatory variant effect prediction with a unified DNA sequence model},
    author = {Avsec, {\v Z}iga and Latysheva, Natasha and Cheng, Jun and Novati, Guido and Taylor, Kyle R. and Ward, Tom and Bycroft, Clare and Nicolaisen, Lauren and Arvaniti, Eirini and Pan, Joshua and Thomas, Raina and Dutordoir, Vincent and Perino, Matteo and De, Soham and Karollus, Alexander and Gayoso, Adam and Sargeant, Toby and Mottram, Anne and Wong, Lai Hong and Drot{\'a}r, Pavol and Kosiorek, Adam and Senior, Andrew and Tanburn, Richard and Applebaum, Taylor and Basu, Souradeep and Hassabis, Demis and Kohli, Pushmeet},
    elocation-id = {2025.06.25.661532},
    year = {2025},
    doi = {10.1101/2025.06.25.661532},
    publisher = {Cold Spring Harbor Laboratory},
    URL = {https://www.biorxiv.org/content/early/2025/06/27/2025.06.25.661532},
    eprint = {https://www.biorxiv.org/content/early/2025/06/27/2025.06.25.661532.full.pdf},
    journal = {bioRxiv}
}
```

            
