pytorch-merge


Name: pytorch-merge
Version: 0.1.2
Summary: Merge LLM weights files that are split into multiple parts
Upload time: 2023-08-14 19:53:22
Requires Python: >=3.7
License: GNU General Public License v3 or later
Keywords: video, gestalt, movie, summary, summarization, thumbnails
Homepage: https://github.com/donaldafeith/Pytorch_Merge
Requirements: No requirements were recorded.
# PyTorch Merge

This repository contains a script, **py_merge.py**, that merges two or more PyTorch model `.bin` files into a single model file. This is useful when you need to combine the weights of models that share the same architecture and are compatible. The script averages the parameter values for keys that exist in more than one file.
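To make the merge rule concrete, here is a minimal sketch of the averaging idea (not the package's actual code; `merge_state_dicts` and the file names are only illustrative): parameters that appear in several files are averaged, and parameters unique to one file are carried over unchanged.

```
import torch

def merge_state_dicts(paths, output_path):
    """Average parameters shared across files; keep unique ones as-is."""
    sums, counts = {}, {}
    for path in paths:
        state_dict = torch.load(path, map_location="cpu")
        for key, tensor in state_dict.items():
            if key in sums:
                # Accumulate in float32 to avoid fp16 rounding while summing.
                sums[key] = sums[key] + tensor.float()
                counts[key] += 1
            else:
                sums[key] = tensor.float()
                counts[key] = 1
    # A key seen only once is divided by 1, i.e. copied unchanged.
    merged = {key: sums[key] / counts[key] for key in sums}
    torch.save(merged, output_path)

merge_state_dicts(["model1.bin", "model2.bin"], "merged_model.bin")
```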

[![ko-fi](https://ko-fi.com/img/githubbutton_sm.svg)](https://ko-fi.com/R6R8K4WLS)

## Prerequisites
Before using this script, make sure you have the following Python packages installed:

* PyTorch
* Transformers

Both are installed automatically when you install the tool itself with pip:

```
pip install pytorch_merge
```

This will automatically install the dependencies (`torch` and `transformers`).
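If you want to confirm that both dependencies resolved correctly after installation, a quick check from Python (not part of the package itself) is enough:

```
import torch
import transformers

# Print the installed versions; an ImportError here means the install failed.
print("torch", torch.__version__)
print("transformers", transformers.__version__)
```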

## Usage

Open a terminal and type:

```
pytorch_merge --help
```

This prints the usage instructions.

The tool requires three arguments:

* `--config config.json` -- The configuration file for the model architecture you are working with.
* `--bin model1.bin model2.bin model3.bin` -- The weights (`.bin`) files you want to merge. You can merge the parts of a single multi-part model, or weights from different models, in which case the parameters they share are averaged. You can list as many files as you want; they are merged one after another in a loop.
* `--output merged_model.bin` -- Where to save the output merged model.

For example:

```
pytorch_merge -c config.json -b model1.bin model2.bin -o merged_model.bin
```

You can now load the resulting **merged_model.bin** file into your model architecture.
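For instance, with the `transformers` Auto classes you could rebuild the architecture from `config.json` and load the merged weights roughly like this (a sketch under the assumption that `AutoModel` matches your architecture; you may need a task-specific class such as `AutoModelForCausalLM` instead):

```
import torch
from transformers import AutoConfig, AutoModel

# Rebuild the model skeleton from the same config passed to pytorch_merge.
config = AutoConfig.from_pretrained("config.json")
model = AutoModel.from_config(config)

# Load the merged weights; strict=False reports key mismatches instead of failing.
state_dict = torch.load("merged_model.bin", map_location="cpu")
missing, unexpected = model.load_state_dict(state_dict, strict=False)
print("missing keys:", missing)
print("unexpected keys:", unexpected)
```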

**Note**: Merging models may not always produce the desired results, especially if the models have different architectures or were trained on different data. 

Use this script only when you are sure that the models are compatible.

## License

This tool was made by Donalda Feith and is licensed under the GNU General Public License v3 or later (GPLv3+).

            

        