bitsandbytes


Name: bitsandbytes
Version: 0.45.0
Home page: https://github.com/bitsandbytes-foundation/bitsandbytes
Summary: k-bit optimizers and matrix multiplication routines.
Upload time: 2024-12-05 16:05:19
Maintainer: None
Docs URL: None
Author: Tim Dettmers
Requires Python: None
License: MIT
Keywords: gpu, optimizers, optimization, 8-bit, quantization, compression
Requirements: No requirements were recorded.
Travis-CI: No Travis.
Coveralls test coverage: No coveralls.
# `bitsandbytes`

[![Downloads](https://static.pepy.tech/badge/bitsandbytes)](https://pepy.tech/project/bitsandbytes) [![Downloads](https://static.pepy.tech/badge/bitsandbytes/month)](https://pepy.tech/project/bitsandbytes) [![Downloads](https://static.pepy.tech/badge/bitsandbytes/week)](https://pepy.tech/project/bitsandbytes)

The `bitsandbytes` library is a lightweight Python wrapper around custom CUDA functions, in particular 8-bit optimizers, matrix multiplication (LLM.int8()), and 8-bit and 4-bit quantization functions.
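To illustrate the idea behind int8 matrix multiplication, here is a library-agnostic toy sketch in pure Python: inputs are quantized to int8 with absmax scaling, multiplied with integer accumulation, then dequantized. This is only a conceptual sketch; the real LLM.int8() kernels additionally split out fp16 "outlier" columns and run on CUDA.

```python
# Toy int8 matrix multiplication with absmax scaling
# (conceptual only; not the bitsandbytes implementation).

def absmax_quantize(row):
    """Quantize a list of floats to int8 codes with one absmax scale."""
    scale = max(abs(x) for x in row) / 127 or 1.0
    return [round(x / scale) for x in row], scale

def int8_matmul(A, B):
    """Compute C = A @ B using int8 inputs and integer accumulation."""
    qA = [absmax_quantize(row) for row in A]          # per-row scales for A
    qB = [absmax_quantize(list(col)) for col in zip(*B)]  # per-column for B
    C = []
    for qa, sa in qA:
        out_row = []
        for qb, sb in qB:
            acc = sum(a * b for a, b in zip(qa, qb))  # int32-style accumulate
            out_row.append(acc * sa * sb)             # dequantize the result
        C.append(out_row)
    return C

A = [[0.5, -1.0], [2.0, 0.25]]
B = [[1.0, 0.0], [0.0, 1.0]]
print(int8_matmul(A, B))  # approximately A, since B is the identity
```

The per-row/per-column ("vector-wise") scales keep the quantization error small even when magnitudes vary a lot between rows.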

The library includes quantization primitives for 8-bit and 4-bit operations through `bitsandbytes.nn.Linear8bitLt` and `bitsandbytes.nn.Linear4bit`, and 8-bit optimizers through the `bitsandbytes.optim` module.
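The memory-saving idea behind the 8-bit optimizers is that optimizer state (e.g. Adam's moment estimates) is stored blockwise in int8, with one scale per block. The following is a minimal pure-Python sketch of that storage scheme; it is not the bitsandbytes implementation, which uses dynamic quantization maps and CUDA kernels.

```python
# Toy blockwise 8-bit quantization of a flat optimizer-state vector
# (conceptual sketch only; block size is tiny for readability).

BLOCK = 4  # real implementations use larger blocks, e.g. 256 values

def quantize_state(state):
    """Return (int8 codes, per-block scales) for a flat list of floats."""
    codes, scales = [], []
    for i in range(0, len(state), BLOCK):
        block = state[i:i + BLOCK]
        scale = max(abs(x) for x in block) / 127 or 1.0
        scales.append(scale)
        codes.extend(round(x / scale) for x in block)
    return codes, scales

def dequantize_state(codes, scales):
    """Reconstruct approximate floats from codes and block scales."""
    return [c * scales[i // BLOCK] for i, c in enumerate(codes)]

m = [0.01, -0.02, 0.005, 0.03, 1.5, -0.7, 0.0, 0.2]
codes, scales = quantize_state(m)
restored = dequantize_state(codes, scales)
print(max(abs(a - b) for a, b in zip(m, restored)))  # small roundtrip error
```

Blockwise scales isolate large values: an outlier in one block cannot blow up the quantization error of values in other blocks, which is what makes 8-bit state viable for training.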

There are ongoing efforts to support further hardware backends, e.g. Intel CPU + GPU, AMD GPU, and Apple Silicon. Windows support is also well underway.

**Please head to the official documentation page:**

**[https://huggingface.co/docs/bitsandbytes/main](https://huggingface.co/docs/bitsandbytes/main)**

## `bitsandbytes` multi-backend _alpha_ release is out!

🚀 Big news! After months of hard work and incredible community contributions, we're thrilled to announce the **bitsandbytes multi-backend _alpha_ release**! 💥

Now supporting:
- 🔥 **AMD GPUs** (ROCm)
- ⚡ **Intel CPUs** & **GPUs**

We’d love your early feedback! 🙏

👉 [Instructions for your `pip install` here](https://huggingface.co/docs/bitsandbytes/main/en/installation#multi-backend)

We're super excited about these recent developments and grateful for any constructive input or support that you can give to help us make this a reality (e.g. helping us with the upcoming Apple Silicon backend or reporting bugs). BNB is a community project and we're excited for your collaboration 🤗

## License

`bitsandbytes` is MIT licensed.

We thank Fabio Cannizzo for his work on [FastBinarySearch](https://github.com/fabiocannizzo/FastBinarySearch) which we use for CPU quantization.

            

Raw data

            {
    "_id": null,
    "home_page": "https://github.com/bitsandbytes-foundation/bitsandbytes",
    "name": "bitsandbytes",
    "maintainer": null,
    "docs_url": null,
    "requires_python": null,
    "maintainer_email": null,
    "keywords": "gpu optimizers optimization 8-bit quantization compression",
    "author": "Tim Dettmers",
    "author_email": "dettmers@cs.washington.edu",
    "download_url": null,
    "platform": null,
    "description": "# `bitsandbytes`\n\n[![Downloads](https://static.pepy.tech/badge/bitsandbytes)](https://pepy.tech/project/bitsandbytes) [![Downloads](https://static.pepy.tech/badge/bitsandbytes/month)](https://pepy.tech/project/bitsandbytes) [![Downloads](https://static.pepy.tech/badge/bitsandbytes/week)](https://pepy.tech/project/bitsandbytes)\n\nThe `bitsandbytes` library is a lightweight Python wrapper around CUDA custom functions, in particular 8-bit optimizers, matrix multiplication (LLM.int8()), and 8 & 4-bit quantization functions.\n\nThe library includes quantization primitives for 8-bit & 4-bit operations, through `bitsandbytes.nn.Linear8bitLt` and `bitsandbytes.nn.Linear4bit` and 8-bit optimizers through `bitsandbytes.optim` module.\n\nThere are ongoing efforts to support further hardware backends, i.e. Intel CPU + GPU, AMD GPU, Apple Silicon. Windows support is quite far along and is on its way as well.\n\n**Please head to the official documentation page:**\n\n**[https://huggingface.co/docs/bitsandbytes/main](https://huggingface.co/docs/bitsandbytes/main)**\n\n## `bitsandbytes` multi-backend _alpha_ release is out!\n\n\ud83d\ude80 Big news! After months of hard work and incredible community contributions, we're thrilled to announce the **bitsandbytes multi-backend _alpha_ release**! \ud83d\udca5\n\nNow supporting:\n- \ud83d\udd25 **AMD GPUs** (ROCm)\n- \u26a1 **Intel CPUs** & **GPUs**\n\nWe\u2019d love your early feedback! \ud83d\ude4f\n\n\ud83d\udc49 [Instructions for your `pip install` here](https://huggingface.co/docs/bitsandbytes/main/en/installation#multi-backend)\n\nWe're super excited about these recent developments and grateful for any constructive input or support that you can give to help us make this a reality (e.g. helping us with the upcoming Apple Silicon backend or reporting bugs). BNB is a community project and we're excited for your collaboration \ud83e\udd17\n\n## License\n\n`bitsandbytes` is MIT licensed.\n\nWe thank Fabio Cannizzo for his work on [FastBinarySearch](https://github.com/fabiocannizzo/FastBinarySearch) which we use for CPU quantization.\n",
    "bugtrack_url": null,
    "license": "MIT",
    "summary": "k-bit optimizers and matrix multiplication routines.",
    "version": "0.45.0",
    "project_urls": {
        "Homepage": "https://github.com/bitsandbytes-foundation/bitsandbytes"
    },
    "split_keywords": [
        "gpu",
        "optimizers",
        "optimization",
        "8-bit",
        "quantization",
        "compression"
    ],
    "urls": [
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "999af41d252bf8b0bc5969b4dce1274cd04b7ddc541de1060dd27eca680bc1b2",
                "md5": "20231a257c6fe773c8b0327a36e5c84a",
                "sha256": "0f0323de1ff1fdf8383e79bdad1283516a4c05a6fd2b44a363bf4e059422305b"
            },
            "downloads": -1,
            "filename": "bitsandbytes-0.45.0-py3-none-manylinux_2_24_x86_64.whl",
            "has_sig": false,
            "md5_digest": "20231a257c6fe773c8b0327a36e5c84a",
            "packagetype": "bdist_wheel",
            "python_version": "py3",
            "requires_python": null,
            "size": 69084267,
            "upload_time": "2024-12-05T16:05:19",
            "upload_time_iso_8601": "2024-12-05T16:05:19.724762Z",
            "url": "https://files.pythonhosted.org/packages/99/9a/f41d252bf8b0bc5969b4dce1274cd04b7ddc541de1060dd27eca680bc1b2/bitsandbytes-0.45.0-py3-none-manylinux_2_24_x86_64.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "1e90a2bbb9b5f997b9c9aa9c15ee4adf553ee71053bb942f89fd48d920a1aa9d",
                "md5": "ba1166c87dd842a6689f46befb3f15f5",
                "sha256": "ebbf96e0ecb466716a65ecdeaef3fa1983575447b9ab66b74e5211892507c6ff"
            },
            "downloads": -1,
            "filename": "bitsandbytes-0.45.0-py3-none-win_amd64.whl",
            "has_sig": false,
            "md5_digest": "ba1166c87dd842a6689f46befb3f15f5",
            "packagetype": "bdist_wheel",
            "python_version": "py3",
            "requires_python": null,
            "size": 68520043,
            "upload_time": "2024-12-05T16:05:25",
            "upload_time_iso_8601": "2024-12-05T16:05:25.805201Z",
            "url": "https://files.pythonhosted.org/packages/1e/90/a2bbb9b5f997b9c9aa9c15ee4adf553ee71053bb942f89fd48d920a1aa9d/bitsandbytes-0.45.0-py3-none-win_amd64.whl",
            "yanked": false,
            "yanked_reason": null
        }
    ],
    "upload_time": "2024-12-05 16:05:19",
    "github": true,
    "gitlab": false,
    "bitbucket": false,
    "codeberg": false,
    "github_user": "bitsandbytes-foundation",
    "github_project": "bitsandbytes",
    "travis_ci": false,
    "coveralls": false,
    "github_actions": true,
    "lcname": "bitsandbytes"
}
        