FJFormer

Name: FJFormer
Version: 0.0.66
Summary: Embark on a journey of unparalleled computational prowess with FJFormer, an arsenal of custom JAX/Flax functions and utilities that elevate your AI endeavors to new heights!
Upload time: 2024-06-12 13:32:40
Requires Python: >=3.8
License: Apache-2.0
Keywords: jax, torch, deep learning, machine learning, flax, xla
            <p align="center">
  <img src="logo/light-logo.png" alt="Alt text"/>
</p>

# FJFormer

Embark on a journey of unparalleled computational prowess with FJFormer, an arsenal of custom JAX/Flax functions and utilities that elevate your AI endeavors to new heights!

## Overview

FJFormer is a collection of functions and utilities that help with common tasks when using Flax and JAX, including checkpoint savers, partitioning tools, and other helpers.
The goal of FJFormer is to make your life easier when working with Flax and JAX. Whether you are training a new model,
fine-tuning an existing one, or just exploring the capabilities of these powerful frameworks, FJFormer offers:

- FlashAttention on `TPU/GPU` 🧬
- Bit computations for 8-, 6-, and 4-bit Flax models 🤏
- Smart dataset loading
- Built-in helper functions and loss functions
- GPU Pallas (Triton-like) implementations of `Softmax`, `FlashAttention`, `RMSNorm`, and `LayerNorm`
- Distributed, sharding-aware model loaders and checkpoint savers
- Monitoring utilities for *TPU/GPU/CPU* memory footprint
- Easy-to-use optimizers with schedulers
- Partitioning utilities
- LoRA with `XRapture` 🤠
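For reference, the Pallas kernels listed above compute standard operations. Here is a plain NumPy sketch of what `Softmax` and `RMSNorm` compute mathematically (illustrative only; this is not FJFormer's actual kernel code, which runs on accelerators via Pallas):

```python
import numpy as np

def softmax(x, axis=-1):
    # Subtract the per-row max before exponentiating for numerical stability.
    shifted = x - np.max(x, axis=axis, keepdims=True)
    exp = np.exp(shifted)
    return exp / np.sum(exp, axis=axis, keepdims=True)

def rms_norm(x, weight, eps=1e-6):
    # Scale by the reciprocal root-mean-square of the features;
    # unlike LayerNorm, no mean subtraction is performed.
    rms = np.sqrt(np.mean(x * x, axis=-1, keepdims=True) + eps)
    return (x / rms) * weight

x = np.array([[1.0, 2.0, 3.0], [0.0, 0.0, 0.0]])
probs = softmax(x)                      # rows sum to 1
normed = rms_norm(x, weight=np.ones(3)) # per-row RMS normalized
```

The fused Pallas versions exist because doing these reductions in a single kernel avoids materializing intermediates in accelerator memory.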

Many of these features are fully documented, so FJFormer has plenty to offer on its own; it is not just a computation backend for [EasyDel](https://github.com/erfanzar/EasyDel).
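For context on the LoRA item above: LoRA keeps the base weight frozen and learns a small low-rank additive update. A minimal NumPy sketch of the general formulation follows (this is the standard LoRA math, not `XRapture`'s API; all names and sizes here are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
d, r = 8, 2                      # feature dimension and low rank (illustrative)

W = rng.normal(size=(d, d))      # frozen base weight
A = rng.normal(size=(r, d))      # trainable down-projection
B = np.zeros((d, r))             # trainable up-projection; zero init means
alpha = 16.0                     # the update starts at exactly zero

def lora_forward(x, W, A, B, alpha, r):
    # Base projection plus the scaled low-rank update (alpha / r) * B @ A.
    return x @ W.T + (alpha / r) * (x @ A.T @ B.T)

x = rng.normal(size=(4, d))
y = lora_forward(x, W, A, B, alpha, r)
```

Because only `A` and `B` (2*d*r parameters) are trained instead of the full d*d weight, fine-tuning memory drops sharply while the frozen model is left untouched.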

Check out the documentation [here](https://erfanzar.github.io/FJFormer/).

## Contributing

FJFormer is an open-source project, and contributions are always welcome! If you have a feature request, bug report, or
just want to help out with development, please check out our GitHub repository and feel free to submit a pull request or
open an issue.

Thank you for using FJFormer, and happy training!


            
