[![Multi-Modality](agorabanner.png)](https://discord.gg/qUtxnK2NMf)
# Complex Transformer
The open-source implementation of the attention mechanism and transformer from "Building Blocks for a Complex-Valued Transformer Architecture", which proposes an attention mechanism for complex-valued signals and images, such as MRI and remote-sensing data.
They present:
- complex-valued scaled dot-product attention
- complex-valued layer normalization
- results showing improved robustness to overfitting while maintaining performance when compared to a real-valued transformer
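A minimal sketch of the two building blocks above in plain PyTorch. The function names and exact formulation here are illustrative assumptions, not the package's actual API: attention scores are taken as the real part of the Hermitian inner product `Q Kᴴ` (one common choice for complex-valued attention), and the layer norm normalizes by the mean squared magnitude.

```python
import torch

def complex_scaled_dot_product_attention(q, k, v):
    """Scaled dot-product attention for complex-valued q, k, v of shape (..., seq, dim).

    Uses the real part of the complex inner product q · conj(k) as the
    similarity score, so the softmax weights stay real-valued.
    """
    d = q.shape[-1]
    scores = torch.matmul(q, k.conj().transpose(-2, -1)).real / d ** 0.5
    weights = torch.softmax(scores, dim=-1)
    # Cast the real weights to the complex dtype so matmul dtypes match.
    return torch.matmul(weights.to(v.dtype), v)

def complex_layer_norm(x, eps=1e-5):
    """Normalize a complex tensor over its last dim.

    Centers by the complex mean and scales by the mean squared magnitude,
    so the normalized output has (approximately) unit average power.
    """
    mean = x.mean(dim=-1, keepdim=True)
    centered = x - mean
    var = (centered.abs() ** 2).mean(dim=-1, keepdim=True)  # real-valued
    return centered / torch.sqrt(var + eps)
```

Both operations reduce to their familiar real-valued counterparts when the imaginary parts are zero, which is one way to sanity-check a complex-valued implementation.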
## Install
`pip install complex-attn`
## License
MIT
## Raw data

```json
{
  "_id": null,
  "home_page": "https://github.com/kyegomez/ct",
  "name": "complex-attn",
  "maintainer": "",
  "docs_url": null,
  "requires_python": ">=3.6,<4.0",
  "maintainer_email": "",
  "keywords": "artificial intelligence,deep learning,optimizers,Prompt Engineering",
  "author": "Kye Gomez",
  "author_email": "kye@apac.ai",
  "download_url": "https://files.pythonhosted.org/packages/64/56/604a95193596e88713ad297f482f4ce90f8c0ea1e22af83da562d500ddcf/complex_attn-0.0.3.tar.gz",
  "platform": null,
  "description": "[![Multi-Modality](agorabanner.png)](https://discord.gg/qUtxnK2NMf)\n\n# Complex Transformer\nThe open source implementation of the attention and transformer from \"Building Blocks for a Complex-Valued Transformer Architecture\" where they propose an an attention mechanism for complex valued signals or images such as MRI and remote sensing.\n\nThey present:\n- complex valued scaled dot product attention\n- complex valued layer normalization\n- results show improved robustness to overfitting while maintaing performance wbhen compared to real valued transformer\n\n## Install\n`pip install ct`\n\n# License\nMIT\n\n\n\n",
  "bugtrack_url": null,
  "license": "MIT",
  "summary": "ct - Pytorch",
  "version": "0.0.3",
  "project_urls": {
    "Homepage": "https://github.com/kyegomez/ct",
    "Repository": "https://github.com/kyegomez/ct"
  },
  "split_keywords": [
    "artificial intelligence",
    "deep learning",
    "optimizers",
    "prompt engineering"
  ],
  "urls": [
    {
      "comment_text": "",
      "digests": {
        "blake2b_256": "5874a55ee4d08f251f363b9233ba94171296566c341d5e64b2f3558d367fe31d",
        "md5": "6ed9e9efea063bf5f306df04a970098f",
        "sha256": "b810d00c848ab76e61ade9b225727c3bdf833555599a4f0e5d42a9d107c61008"
      },
      "downloads": -1,
      "filename": "complex_attn-0.0.3-py3-none-any.whl",
      "has_sig": false,
      "md5_digest": "6ed9e9efea063bf5f306df04a970098f",
      "packagetype": "bdist_wheel",
      "python_version": "py3",
      "requires_python": ">=3.6,<4.0",
      "size": 4194,
      "upload_time": "2023-10-07T15:56:30",
      "upload_time_iso_8601": "2023-10-07T15:56:30.804621Z",
      "url": "https://files.pythonhosted.org/packages/58/74/a55ee4d08f251f363b9233ba94171296566c341d5e64b2f3558d367fe31d/complex_attn-0.0.3-py3-none-any.whl",
      "yanked": false,
      "yanked_reason": null
    },
    {
      "comment_text": "",
      "digests": {
        "blake2b_256": "6456604a95193596e88713ad297f482f4ce90f8c0ea1e22af83da562d500ddcf",
        "md5": "77dfad3c49bdc76553214df1cb588ba6",
        "sha256": "309d55fd87dfccbe6dcb25fb186fad46cc5a02f9f732da463e2434dde4e853a6"
      },
      "downloads": -1,
      "filename": "complex_attn-0.0.3.tar.gz",
      "has_sig": false,
      "md5_digest": "77dfad3c49bdc76553214df1cb588ba6",
      "packagetype": "sdist",
      "python_version": "source",
      "requires_python": ">=3.6,<4.0",
      "size": 4219,
      "upload_time": "2023-10-07T15:56:32",
      "upload_time_iso_8601": "2023-10-07T15:56:32.696082Z",
      "url": "https://files.pythonhosted.org/packages/64/56/604a95193596e88713ad297f482f4ce90f8c0ea1e22af83da562d500ddcf/complex_attn-0.0.3.tar.gz",
      "yanked": false,
      "yanked_reason": null
    }
  ],
  "upload_time": "2023-10-07 15:56:32",
  "github": true,
  "gitlab": false,
  "bitbucket": false,
  "codeberg": false,
  "github_user": "kyegomez",
  "github_project": "ct",
  "travis_ci": false,
  "coveralls": false,
  "github_actions": true,
  "requirements": [
    {
      "name": "torch",
      "specs": []
    },
    {
      "name": "einops",
      "specs": []
    }
  ],
  "lcname": "complex-attn"
}
```