[![Multi-Modality](agorabanner.png)](https://discord.gg/qUtxnK2NMf)
# SimplifiedTransformers
An implementation of *Simplifying Transformer Blocks*. Standard transformer blocks are complex, and that complexity can lead to instability. Using signal propagation theory and empirical observations, this work proposes modifications that remove several components from the block without sacrificing anything. The simplified blocks match the per-step training speed and final performance of standard transformers while delivering 15% higher training throughput and using 15% fewer parameters.
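For intuition, here is a minimal sketch of what a simplified block might look like. The specific simplifications shown (a single pre-norm, attention and MLP branches run in parallel, no explicit residual skip) are illustrative assumptions, not the exact design used by this repository; see the paper for the actual components removed.

```python
import torch
import torch.nn as nn


class SimplifiedBlockSketch(nn.Module):
    """Illustrative sketch only: one norm, parallel attention/MLP branches,
    and no separate residual stream. The real SimplifiedTransformers block
    may remove a different set of components."""

    def __init__(self, dim: int, heads: int):
        super().__init__()
        self.norm = nn.LayerNorm(dim)  # single pre-norm instead of two
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.mlp = nn.Sequential(
            nn.Linear(dim, dim * 4),
            nn.GELU(),
            nn.Linear(dim * 4, dim),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h = self.norm(x)
        attn_out, _ = self.attn(h, h, h, need_weights=False)
        # Sum the parallel branches; no explicit skip connection.
        return attn_out + self.mlp(h)


block = SimplifiedBlockSketch(dim=64, heads=8)
x = torch.randn(1, 16, 64)          # (batch, seq_len, dim)
print(block(x).shape)               # torch.Size([1, 16, 64])
```

Running branches in parallel from a single normalized input is one common way to cut per-block latency, since the attention and MLP computations no longer depend on each other.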
# Install
```
pip install simplified-transormer-torch
```
--------
## Usage
```python
import torch
from simplified_transformers.main import SimplifiedTransformers

# Embedding dimension, number of blocks, attention heads, and vocab size.
model = SimplifiedTransformers(
    dim=4096,
    depth=6,
    heads=8,
    num_tokens=20000,
)

# A batch of token IDs: batch size 1, sequence length 4096.
x = torch.randint(0, 20000, (1, 4096))

out = model(x)
print(out.shape)
```
# License
MIT