mergoo


Name: mergoo
Version: 0.0.10
Home page: None
Summary: Implementation of the Leeroo LLM composer.
Upload time: 2024-06-02 07:58:03
Maintainer: None
Docs URL: None
Author: None
Requires Python: None
License: GNU Lesser General Public License v3.0 (LGPL-3.0)
Keywords: llm, compose, moe, router, mixture-of-adapters, merge
Requirements: No requirements were recorded.
            <h1>Mergoo

<img alt='Leeroo logo' src='https://github.com/Leeroo-AI/mergoo/blob/main/static/logo.png?raw=true' width='148' align='right' />

</h1>

[![made-with-python](https://img.shields.io/badge/Made%20with-Python-green.svg)](#python)
[![License: LGPLv3.0](https://img.shields.io/badge/License-LGPLv3.0-yellow.svg)](https://www.gnu.org/licenses/lgpl-3.0.en.html)
[![Version](https://img.shields.io/pypi/v/mergoo?color=blue)](https://pypi.org/project/mergoo/)



`mergoo` is a library for easily merging multiple LLM experts and efficiently training the merged LLM. With `mergoo`, you can integrate the knowledge of different generic or domain-specific LLM experts.

<img src='https://github.com/Leeroo-AI/mergoo/blob/main/static/base_light.png?raw=true' />

## 🚀 Features

- Supports several merging methods: **Mixture-of-Experts**, **Mixture-of-Adapters**, and **Layer-wise merging** 
- Flexible merging for each layer
- Supported base models: [Llama](https://llama.meta.com/) (including LLaMa3), [Mistral](https://huggingface.co/docs/transformers/en/model_doc/mistral), [Phi3](https://huggingface.co/docs/transformers/main/en/model_doc/phi3), and [BERT](https://huggingface.co/docs/transformers/en/model_doc/bert)
- Supported trainers: 🤗 [Trainer](https://huggingface.co/docs/transformers/en/main_classes/trainer), [SFTTrainer](https://huggingface.co/docs/trl/en/sft_trainer), [PEFT](https://huggingface.co/docs/peft/en/index)
- Supported devices: CPU, MPS, GPU
- Training options: train only the routers of the MoE layers, or fully fine-tune the merged LLM

If you like the project, consider leaving a ⭐️

## Installation
Install with pip:
```
pip install mergoo
```
Install the latest (unstable) version from GitHub:
```
pip install git+https://github.com/Leeroo-AI/mergoo
```
Install from source:
```
git clone https://github.com/Leeroo-AI/mergoo
cd mergoo
pip install -e .
``` 
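To confirm the installation, a quick sanity check (a minimal sketch using only the standard library) is to import the package and print the installed version:
```python
# Sanity check: confirm mergoo is importable and report the installed version.
from importlib.metadata import version

import mergoo  # raises ImportError if the installation failed

print(version("mergoo"))  # e.g. 0.0.10
```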

## Quick Start
### Configuration Setup
Specify the config for merging:
- ```model_type```: Type of the base model. Choices: ```mistral```, ```llama```, or ```bert```.
- ```num_experts_per_tok```: Number of experts activated for each token in the MoE layers.
- ```experts```: Configs of the experts to merge; each entry includes an ```expert_name``` and a Hugging Face 🤗 ```model_id```.
- ```router_layers```: Layers on which Mixture-of-Experts routing is applied.

#### Fully Fine-tuned Experts
Below is a sample config for merging **fully** fine-tuned LLM experts:
```python
config = {
    "model_type": "mistral",
    "num_experts_per_tok": 2,
    "experts": [
        {"expert_name": "base_expert", "model_id": "mistralai/Mistral-7B-v0.1"},
        {"expert_name": "expert_1", "model_id": "meta-math/MetaMath-Mistral-7B"},
        {"expert_name": "expert_2", "model_id": "ajibawa-2023/Code-Mistral-7B"}
    ],
    "router_layers": ["gate_proj", "up_proj", "down_proj"]
}
```
In the above example, we merge math and code Mistral-based experts. Please refer to [this notebook](https://github.com/Leeroo-AI/mergoo/blob/main/notebooks/llama_compose_trainer.ipynb) for further details!

#### Mixture of Adapters (MoE on LoRA)
Below is a sample config for merging **LoRA** fine-tuned LLM experts. ```mergoo``` builds a routing layer on top of the LoRAs, resulting in a **mixture of adapters**:
```python
config = {
    "model_type": "mistral",
    "num_experts_per_tok": 2,
    "base_model": "mistralai/Mistral-7B-v0.1",
    "experts": [
        {"expert_name": "adapter_1", "model_id": "predibase/customer_support"},
        {"expert_name": "adapter_2", "model_id": "predibase/customer_support_accounts"},
        {"expert_name": "adapter_3", "model_id": "predibase/customer_support_orders"},
        {"expert_name": "adapter_4", "model_id": "predibase/customer_support_payments"}
    ],
}
```
Note that each ```expert_name``` starts with ```adapter``` instead of ```expert```. Please refer to [this notebook](https://github.com/Leeroo-AI/mergoo/blob/main/notebooks/Mistral_lora_compose_trainer.ipynb) for further details!

### Merge Experts 
Given the config, ```mergoo``` creates the merged LLM as follows:
```python
import torch
from mergoo.compose_experts import ComposeExperts

# create checkpoint
model_id = "data/mistral_lora_moe"
expertmerger = ComposeExperts(config, torch_dtype=torch.float16)
expertmerger.compose()
expertmerger.save_checkpoint(model_id)
```

### Load / Finetune Merged Expert
Now, you can easily train the merged LLM with Hugging Face Trainer:
```python
from transformers import Trainer
from mergoo.models.modeling_mistral import MistralForCausalLM

model = MistralForCausalLM.from_pretrained("data/mistral_lora_moe") 
# NOTE: 'gate' / router layers are untrained, so weight-loading warnings will appear for them

trainer = Trainer( ... )
trainer.train()
```
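
To train only the routers of the MoE layers (one of the training choices listed in the features above) while keeping the expert weights frozen, a minimal sketch is to disable gradients for every parameter except the router ('gate') layers before constructing the Trainer. This assumes the router parameters contain `gate` in their names, as hinted by the weight-loading warning above:
```python
# Sketch: router-only training, assuming router parameters contain "gate" in their names.
trainable = 0
for name, param in model.named_parameters():
    if "gate" in name:   # router / gating layers introduced by mergoo
        param.requires_grad = True
        trainable += param.numel()
    else:                # freeze expert and backbone weights
        param.requires_grad = False
print(f"Trainable router parameters: {trainable}")
```
Fully fine-tuning the merged LLM is simply the default: leave all parameters trainable and pass the model to the `Trainer` as usual.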
## 📚 Learn More

After finishing the Quick Start guide, you can explore the tutorials below to further familiarize yourself with `mergoo`.

<table>
<thead>
  <tr>
      <th><b>Notebook</b></th>
      <th><b>Details</b></th>
  </tr>
</thead>
<tbody>
    <tr>
    <td><a href="https://github.com/Leeroo-AI/mergoo/blob/main/notebooks/llama_compose_trainer.ipynb"> MoE with fully fine-tuned LLM experts </a></td>
    <td>Build a unified Mixture-of-Experts model with fully fine-tuned experts. Inspired by <a href=https://arxiv.org/html/2403.07816v1> BTX Research</a> (Meta AI).</td>
  </tr>
  <tr>
    <td><a href="https://github.com/Leeroo-AI/mergoo/blob/main/notebooks/Mistral_lora_compose_trainer.ipynb"> MoE with LoRA fine-tuned experts  </a></td>
    <td> Build a Mixture-of-Adapters expert. Inspired by <a href=https://arxiv.org/abs/2402.07148>xlora</a> | <a href=https://arxiv.org/abs/2403.03432>Mixture-of-LoRAs</a> | <a href="https://openreview.net/forum?id=uWvKBCYh4S">MoLE</a> | <a href=https://huggingface.co/papers/2402.05859>PHATGOOSE</a> | <a href=https://arxiv.org/abs/2402.12851>MoELoRA</a></td>
  </tr>
    <tr>
    <td><a href="https://huggingface.co/blog/alirezamsh/mergoo"> Hugging Face Blog </a></td>
    <td> A deep dive into the research behind the merging methods of the mergoo library</td>
  </tr>
    <tr>
    <td><a href="https://github.com/Leeroo-AI/mergoo/blob/main/notebooks/integrate_llama3_experts.ipynb"> LLaMa3-based Experts </a></td>
    <td> Build your own MoE-style LLM experts by integrating LLaMa3-based domain experts</td>
  </tr>
    <tr>
    <td><a href="https://github.com/Leeroo-AI/mergoo/blob/main/notebooks/integrate_phi3_experts.ipynb"> Phi3-based Experts </a></td>
    <td> Create MoE-style LLM architecture by merging Phi3-based fine-tuned models</td>
  </tr>
</tbody>
</table>


## Mergoo Roadmap and Contributing

As an open-source library in a fast-evolving domain, we welcome contributions, whether that means introducing new features, enhancing infrastructure, or improving documentation.

Here is the `mergoo` roadmap:

- [X] Support MoE for the Transformer block
- [X] Compatibility with Hugging Face 🤗
- [X] Support [Trainer](https://huggingface.co/docs/transformers/en/main_classes/trainer), [SFTTrainer](https://huggingface.co/docs/trl/en/sft_trainer)
- [X] Loading unified checkpoint in BTX
- [X] Feature: Convertible QKV linear layers
- [X] Feature: Convertible FF linear layers
- [X] Feature: Routers only for a given list of decoder layer indices
- [X] Sharded [Safetensors](https://github.com/huggingface/safetensors) saving
- [X] Support experts based on [LLaMa](https://huggingface.co/docs/transformers/en/model_doc/llama) and [Mistral](https://huggingface.co/docs/transformers/en/model_doc/mistral)
- [X] Support experts based on [Phi3](https://huggingface.co/docs/transformers/main/en/model_doc/phi3)
- [X] Support Mixture of LoRA Experts (Mixture of Adapters)
- [ ] Router load-balancing loss
- [ ] Lazy loading of tensors for lower memory usage during merging
- [ ] Support other layer-wise merging methods, including [Mergekit](https://github.com/arcee-ai/mergekit)
- [ ] Support experts based on [Gemma](https://blog.google/technology/developers/gemma-open-models) and [Mamba](https://arxiv.org/abs/2312.00752)
- [ ] Support flash-attention
- [ ] Support Mixture-of-Depths Transformers

Feel free to suggest new features and/or contribute to `mergoo` roadmap!

## Join our community!

🚀 We would love to hear your feedback; please join the Leeroo community:

- [Twitter](https://twitter.com/LeerooAI)
- [LinkedIn](https://www.linkedin.com/company/leeroo)
- [Website](https://www.leeroo.com)
- [Discord](https://discord.gg/hqVbPNNEZM)

Have a question not listed here? Open a GitHub issue or send us an [email](mailto:support@leeroo.com)!

            
