FJFormer

Name: FJFormer
Version: 0.0.50
Summary: Embark on a journey of paralleled/unparalleled computational prowess with FJFormer - an arsenal of custom Jax Flax Functions and Utils that elevate your AI endeavors to new heights!
Upload time: 2024-04-16 13:18:59
Requires Python: >=3.8
License: Apache-2.0
Keywords: jax, torch, deep learning, machine learning, flax, xla
Requirements: none recorded
<p align="center">
  <img src="logo/light-logo.png" alt="Alt text"/>
</p>

# FJFormer

Embark on a journey of paralleled/unparalleled computational prowess with FJFormer - an arsenal of custom Jax Flax
Functions and Utils that elevate your AI endeavors to new heights!

## Overview

FJFormer is a collection of functions and utilities that help with various tasks when using Flax and JAX, including checkpoint savers, partitioning tools, and other helpers.
The goal of FJFormer is to make your life easier when working with Flax and JAX. Whether you are training a new model,
fine-tuning an existing one, or just exploring the capabilities of these powerful frameworks, FJFormer offers:

- FlashAttention on `TPU/GPU` 🧬
- Bit computations for 8-, 6-, and 4-bit Flax models 🤏
- Smart dataset loading
- Built-in helper and loss functions
- GPU Pallas Triton-like implementations of `Softmax`, `FlashAttention`, `RMSNorm`, and `LayerNorm`
- Distributed, sharded model loaders and checkpoint savers
- Monitoring utils for *TPU/GPU/CPU* memory footprint (see the memory sketch below)
- Easy-to-use optimizers with built-in schedulers
- Partitioning utils (see the sharding sketch after this list)
- LoRA with `XRapture` 🤠 (see the concept sketch below)
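
To make the partitioning and sharding items above concrete, here is a minimal sketch using plain `jax.sharding` (the machinery such partitioning utilities build on), not FJFormer's own API:

```python
# Minimal sharding sketch with plain jax.sharding; FJFormer's partitioning
# utils wrap this kind of machinery, but this is not FJFormer's API.
import numpy as np
import jax
import jax.numpy as jnp
from jax.sharding import Mesh, NamedSharding, PartitionSpec as P

devices = np.array(jax.devices())         # 1-D mesh over all available devices
mesh = Mesh(devices, axis_names=("dp",))  # single "dp" (data-parallel) axis

# Shard a parameter matrix along its first dimension across the "dp" axis.
params = jnp.ones((8, 512))
sharded = jax.device_put(params, NamedSharding(mesh, P("dp", None)))
print(sharded.sharding)                   # how the array is laid out across devices
```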
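
The memory-footprint monitoring can be approximated with JAX's own per-device statistics. A rough sketch; the exact fields vary by backend, and some backends (notably CPU) report nothing:

```python
# Rough memory-footprint sketch using JAX's per-device allocator stats;
# field names and availability vary by backend.
import jax

for device in jax.local_devices():
    try:
        stats = device.memory_stats()  # per-device allocator statistics
    except Exception:
        stats = None                   # some backends do not support this
    if stats:
        used = stats.get("bytes_in_use", 0) / 2**20
        limit = stats.get("bytes_limit", 0) / 2**20
        print(f"{device}: {used:.1f} MiB used / {limit:.1f} MiB limit")
    else:
        print(f"{device}: memory stats not available")
```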
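
Finally, the idea behind the LoRA item is low-rank adaptation: a frozen weight matrix `W` is augmented with a trainable update of much lower rank. A concept sketch, not `XRapture`'s actual interface:

```python
# LoRA concept sketch (not XRapture's interface): y = x @ W + x @ A @ B,
# where W stays frozen and only the low-rank pair (A, B) is trained.
import jax
import jax.numpy as jnp

def lora_apply(x, W, A, B, scale=1.0):
    # W: (in, out) frozen; A: (in, r) and B: (r, out) trainable, r << in, out.
    return x @ W + scale * (x @ A @ B)

key = jax.random.PRNGKey(0)
k1, k2, k3 = jax.random.split(key, 3)
in_dim, out_dim, rank = 512, 512, 8
W = jax.random.normal(k1, (in_dim, out_dim))      # pretrained, frozen
A = jax.random.normal(k2, (in_dim, rank)) * 0.01  # small random init
B = jnp.zeros((rank, out_dim))                    # zero init: training starts at W
x = jax.random.normal(k3, (4, in_dim))
print(lora_apply(x, W, A, B).shape)               # (4, 512)
```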

Many of these features are fully documented, so FJFormer has something
to offer beyond being a computation backend for [EasyDel](https://github.com/erfanzar/EasyDel).

Check out the documentation [here](https://erfanzar.github.io/FJFormer/).

## Contributing

FJFormer is an open-source project, and contributions are always welcome! If you have a feature request, bug report, or
just want to help out with development, please check out our GitHub repository and feel free to submit a pull request or
open an issue.

Thank you for using FJFormer, and happy training!


            

## Raw data

```json
{
    "_id": null,
    "home_page": null,
    "name": "FJFormer",
    "maintainer": null,
    "docs_url": null,
    "requires_python": ">=3.8",
    "maintainer_email": null,
    "keywords": "JAX, Torch, Deep Learning, Machine Learning, Flax, XLA",
    "author": null,
    "author_email": "Erfan Zare Chavoshi <Erfanzare810@gmail.com>",
    "download_url": "https://files.pythonhosted.org/packages/f6/06/81834b00a3f4b364e64e3d9cbd93cbbf5c98d4feaf1ed86db9874cec11f8/fjformer-0.0.50.tar.gz",
    "platform": null,
    "description": "<p align=\"center\">\n  <img src=\"logo/light-logo.png\" alt=\"Alt text\"/>\n</p>\n\n# FJFormer\n\nEmbark on a journey of paralleled/unparalleled computational prowess with FJFormer - an arsenal of custom Jax Flax\nFunctions and Utils that elevate your AI endeavors to new heights!\n\n## Overview\n\nFJFormer is a collection of functions and utilities that can help with various tasks when using Flax and JAX. It\nincludes\ncheckpoint savers, partitioning tools, and other helpful functions.\nThe goal of FJFormer is to make your life easier when working with Flax and JAX. Whether you are training a new model,\nfine-tuning an existing one, or just exploring the capabilities of these powerful frameworks, FJFormer offers\n\n- FlashAttention on `TPU/GPU` \ud83e\uddec\n- BITComputations for 8,6,4 BIT Flax Models \ud83e\udd0f\n- Smart Dataset Loading\n- Built-in functions and Loss functions\n- GPU-Pallas triton like implementation of `Softmax`, `FlashAttention`, `RMSNorm`, `LayerNorm`\n- Distributed and sharding Model Loaders and Checkpoint Savers\n- Monitoring Utils for *TPU/GPU/CPU* memory `foot-print`\n- Special Optimizers with schedulers and Easy to Use\n- Partitioning Utils\n- LoRA with `XRapture` \ud83e\udd20\n\nand A lot of these features are fully documented so i gusse FJFormer has something\nto offer, and it's not just a Computation BackEnd for [EasyDel](https://github.com/erfanzar/EasyDel).\n\ncheckout for documentations [here](https://erfanzar.github.io/FJFormer/).\n\n## Contributing\n\nFJFormer is an open-source project, and contributions are always welcome! If you have a feature request, bug report, or\njust want to help out with development, please check out our GitHub repository and feel free to submit a pull request or\nopen an issue.\n\nThank you for using FJFormer, and happy training!\n\n",
    "bugtrack_url": null,
    "license": "Apache-2.0",
    "summary": "Embark on a journey of paralleled/unparalleled computational prowess with FJFormer - an arsenal of custom Jax Flax Functions and Utils that elevate your AI endeavors to new heights!",
    "version": "0.0.50",
    "project_urls": null,
    "split_keywords": [
        "jax",
        " torch",
        " deep learning",
        " machine learning",
        " flax",
        " xla"
    ],
    "urls": [
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "e24ccf4c67e0ed9e391664f9e9ca3a9b7cee54b100b219e36fc6d009eb111e31",
                "md5": "ee790f82f15af01ee81a59b032474187",
                "sha256": "62a5ab2bc38de8564b414211ee6ef9f4fee20399921eaf0f5ed83e3d0914c449"
            },
            "downloads": -1,
            "filename": "FJFormer-0.0.50-py3-none-any.whl",
            "has_sig": false,
            "md5_digest": "ee790f82f15af01ee81a59b032474187",
            "packagetype": "bdist_wheel",
            "python_version": "py3",
            "requires_python": ">=3.8",
            "size": 170242,
            "upload_time": "2024-04-16T13:18:27",
            "upload_time_iso_8601": "2024-04-16T13:18:27.479273Z",
            "url": "https://files.pythonhosted.org/packages/e2/4c/cf4c67e0ed9e391664f9e9ca3a9b7cee54b100b219e36fc6d009eb111e31/FJFormer-0.0.50-py3-none-any.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "f60681834b00a3f4b364e64e3d9cbd93cbbf5c98d4feaf1ed86db9874cec11f8",
                "md5": "01eeced17825b6e4b04356394b137f1a",
                "sha256": "8696714b4d7473eeb6b45abeb8f638904ae00065696b0e5df1e01d25193973b4"
            },
            "downloads": -1,
            "filename": "fjformer-0.0.50.tar.gz",
            "has_sig": false,
            "md5_digest": "01eeced17825b6e4b04356394b137f1a",
            "packagetype": "sdist",
            "python_version": "source",
            "requires_python": ">=3.8",
            "size": 148040,
            "upload_time": "2024-04-16T13:18:59",
            "upload_time_iso_8601": "2024-04-16T13:18:59.795170Z",
            "url": "https://files.pythonhosted.org/packages/f6/06/81834b00a3f4b364e64e3d9cbd93cbbf5c98d4feaf1ed86db9874cec11f8/fjformer-0.0.50.tar.gz",
            "yanked": false,
            "yanked_reason": null
        }
    ],
    "upload_time": "2024-04-16 13:18:59",
    "github": false,
    "gitlab": false,
    "bitbucket": false,
    "codeberg": false,
    "lcname": "fjformer"
}
```