| Field | Value |
| --- | --- |
| Name | checkpoint-tools |
| Version | 0.1.0 |
| Summary | Useful tools for working with pytorch checkpoints and popular machine learning libraries. |
| Author | Benjamin Paine |
| Requires Python | >=3.8.0 |
| License | None |
| Home page | None |
| Maintainer | None |
| Docs URL | None |
| Keywords | None |
| Upload time | 2025-01-16 00:56:52 |
| Requirements | No requirements were recorded. |
# Checkpoint Tools
A small collection of helpful command-line tools for working with PyTorch checkpoints.
<div align="center">
<img src="https://img.shields.io/static/v1?label=painebenjamin&message=checkpoint-tools&color=00519c&logo=github" alt="painebenjamin - checkpoint-tools">
<img src="https://img.shields.io/github/stars/painebenjamin/checkpoint-tools?style=social" alt="stars - checkpoint-tools">
<img src="https://img.shields.io/github/forks/painebenjamin/checkpoint-tools?style=social" alt="forks - checkpoint-tools"><br />
<a href="https://github.com/painebenjamin/checkpoint-tools/blob/main/LICENSE"><img src="https://img.shields.io/badge/License-Apache-00519c" alt="License"></a>
<a href="https://pypi.org/project/checkpoint-tools"><img alt="PyPI - Version" src="https://img.shields.io/pypi/v/checkpoint-tools?color=00519c"></a>
<a href="https://pypistats.org/packages/checkpoint-tools"><img alt="PyPI - Downloads" src="https://img.shields.io/pypi/dd/checkpoint-tools?logo=python&logoColor=white&color=00519c"></a>
</div>
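
The package is published on PyPI (see the badges above), so a standard pip installation should work:

```shell
# Install checkpoint-tools from PyPI
pip install checkpoint-tools
```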
# Available Commands
## `metadata`
```
Usage: checkpoint-tools metadata [OPTIONS] INPUT_FILE

  Print metadata of a SafeTensors checkpoint.

Options:
  --help  Show this message and exit.
```
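
For example, to inspect a checkpoint (the file name below is hypothetical):

```shell
# Print the metadata stored in a SafeTensors checkpoint
checkpoint-tools metadata model.safetensors
```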
## `convert`
```
Usage: checkpoint-tools convert [OPTIONS] INPUT_FILE

  Convert a PyTorch/SafeTensors checkpoint to SafeTensors format, optionally
  changing the precision of the (floating point) tensors.

  Writes the file with a suffix appropriate for the precision of the tensors
  in the state dictionary.

Options:
  --float8-e5m2-fn-uz           Convert all floating point tensors to
                                float8-e5m2-fn-uz (5 exponent bits, 2 mantissa
                                bits, finite numbers only, no negative zero)
  --float8-e5m2                 Convert all floating point tensors to
                                float8-e5m2 (5 exponent bits, 2 mantissa bits)
  --float8-e4m3-fn-uz           Convert all floating point tensors to
                                float8-e4m3-fn-uz (4 exponent bits, 3 mantissa
                                bits, finite numbers only, no negative zero)
  --float8-e4m3-fn              Convert all floating point tensors to
                                float8-e4m3-fn (4 exponent bits, 3 mantissa
                                bits, finite numbers only)
  --bfloat16                    Convert all floating point tensors to bfloat16
  --float16                     Convert all floating point tensors to float16
  --full                        Leave all tensors as full precision
  --replace-key TEXT            Keys to replace, use `:` to separate old and
                                new key parts
  --ignore-key TEXT             Keys to ignore
  --overwrite / --no-overwrite  Overwrite output file if it exists
  --name TEXT                   Output file name
  --help                        Show this message and exit.
```
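
A typical invocation might look like the following (the input file name is hypothetical):

```shell
# Convert a PyTorch checkpoint to SafeTensors at float16 precision,
# overwriting any existing output file
checkpoint-tools convert model.ckpt --float16 --overwrite
```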
## `convert-to-diffusers`
```
Usage: checkpoint-tools convert-to-diffusers [OPTIONS] INPUT_FILE

  Convert a non-diffusers PyTorch/SafeTensors checkpoint to Diffusers format
  in SafeTensors.

  Writes the file with a suffix appropriate for the precision of the tensors
  in the state dictionary.

  Supported model types:

    Stable Diffusion 1.5
    Stable Diffusion XL
    Stable Diffusion 3.5
    FLUX.Dev
    FLUX.Schnell

Options:
  --name TEXT                   Output file name
  --model-type TEXT             Model type, default inferred from state
                                dictionary
  --int8                        Quantize all floating point tensors to 8-bit
                                integer using bitsandbytes
  --nf4                         Quantize all floating point tensors to
                                normalized float4 using bitsandbytes
  --float8-e5m2-fn-uz           Convert all floating point tensors to
                                float8-e5m2-fn-uz (5 exponent bits, 2 mantissa
                                bits, finite numbers only, no negative zero)
  --float8-e5m2                 Convert all floating point tensors to
                                float8-e5m2 (5 exponent bits, 2 mantissa bits)
  --float8-e4m3-fn-uz           Convert all floating point tensors to
                                float8-e4m3-fn-uz (4 exponent bits, 3 mantissa
                                bits, finite numbers only, no negative zero)
  --float8-e4m3-fn              Convert all floating point tensors to
                                float8-e4m3-fn (4 exponent bits, 3 mantissa
                                bits, finite numbers only)
  --bfloat16                    Convert all floating point tensors to bfloat16
  --float16                     Convert all floating point tensors to float16
  --full                        Leave all tensors as full precision
  --replace-key TEXT            Keys to replace, use `:` to separate old and
                                new key parts
  --ignore-key TEXT             Keys to ignore
  --overwrite / --no-overwrite  Overwrite output file if it exists
  --name TEXT                   Output file name
  --help                        Show this message and exit.
```
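
For instance, a single-file checkpoint could be converted with NF4 quantization like so (the file name is hypothetical; the model type is inferred from the state dictionary when not given):

```shell
# Convert a single-file checkpoint to Diffusers format with NF4 quantization
checkpoint-tools convert-to-diffusers sd_checkpoint.safetensors --nf4
```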
## `combine`
```
Usage: checkpoint-tools combine [OPTIONS] [INPUT_FILES]...

  Combine multiple checkpoints into a single checkpoint.

Options:
  --float8-e5m2-fn-uz           Convert all floating point tensors to
                                float8-e5m2-fn-uz (5 exponent bits, 2 mantissa
                                bits, finite numbers only, no negative zero)
  --float8-e5m2                 Convert all floating point tensors to
                                float8-e5m2 (5 exponent bits, 2 mantissa bits)
  --float8-e4m3-fn-uz           Convert all floating point tensors to
                                float8-e4m3-fn-uz (4 exponent bits, 3 mantissa
                                bits, finite numbers only, no negative zero)
  --float8-e4m3-fn              Convert all floating point tensors to
                                float8-e4m3-fn (4 exponent bits, 3 mantissa
                                bits, finite numbers only)
  --bfloat16                    Convert all floating point tensors to bfloat16
  --float16                    Convert all floating point tensors to float16
  --full                        Leave all tensors as full precision
  --replace-key TEXT            Keys to replace, use `:` to separate old and
                                new key parts
  --ignore-key TEXT             Keys to ignore
  --overwrite / --no-overwrite  Overwrite output file if it exists
  --name TEXT                   Output file name
  --help                        Show this message and exit.
```
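
For example, two checkpoints could be merged into one bfloat16 file (all file names below are hypothetical):

```shell
# Combine a base checkpoint and a fine-tune into a single bfloat16 checkpoint
checkpoint-tools combine base.safetensors finetune.safetensors --bfloat16 --name combined
```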