Name: mixtral.py
Version: 0.0.1
Home page: https://github.com/ibnaleem/mixtral.py
Summary: A Python module for running the Mixtral-8x7B language model with customisable precision and attention mechanisms.
Upload time: 2024-02-06 12:40:00
Author: Ibn Aleem
Requires Python: >=3.6

# mixtral.py
A Python module for running the [Mixtral-8x7B](https://huggingface.co/mistralai/Mixtral-8x7B-v0.1) language model with customisable precision and attention mechanisms. [Mixtral-8x7B](https://huggingface.co/mistralai/Mixtral-8x7B-v0.1), being a pretrained base model, lacks moderation mechanisms. [Read more here.](https://mistral.ai/news/mixtral-of-experts/)

## Installation
```
pip install mixtral.py  # or: pip3 install mixtral.py
```

## Usage
### Run the model
```python
from mixtral import Mixtral

mixtral = Mixtral()
print("Running the model:")
print(mixtral.generate_text("..."))
```
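Under the hood, the wrapper presumably maps onto the standard Hugging Face `transformers` text-generation path. A minimal sketch of that assumed equivalent (not the actual `mixtral.py` source; the prompt is just an example):
```python
# Assumed rough equivalent of Mixtral().generate_text("...") using
# transformers directly; the wrapper's real internals may differ.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/Mixtral-8x7B-v0.1"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

inputs = tokenizer("My favourite condiment is", return_tensors="pt").to(model.device)
output_ids = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```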
### In half-precision
```python
from mixtral import Mixtral
print("Running the model in half precision:")
mixtral_half_precision = Mixtral(use_half_precision=True)
print(mixtral_half_precision.generate_text("..."))
```
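If `use_half_precision=True` maps onto the usual `transformers` knob, it amounts to loading the weights in `float16` rather than `float32`, halving weight memory. A hedged sketch of the assumed equivalent:
```python
# Assumed equivalent of Mixtral(use_half_precision=True), not the
# actual implementation.
import torch
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained(
    "mistralai/Mixtral-8x7B-v0.1",
    torch_dtype=torch.float16,  # halves weight memory vs. fp32
    device_map="auto",
)
```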
### Lower precision (8-bit & 4-bit) using `bitsandbytes`
```python
from mixtral import Mixtral
print("Running the model in lower precision (4-bit) using bitsandbytes:")
mixtral_4bit = Mixtral(load_in_4bit=True)
print(mixtral_4bit.generate_text("..."))
```
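In `transformers`, 8-bit and 4-bit loading are handled by `bitsandbytes` through a quantisation config, so the wrapper plausibly does something like the following (an assumption, not the actual implementation):
```python
# Assumed equivalent of Mixtral(load_in_4bit=True); for 8-bit, pass
# BitsAndBytesConfig(load_in_8bit=True) instead.
from transformers import AutoModelForCausalLM, BitsAndBytesConfig

quant_config = BitsAndBytesConfig(load_in_4bit=True)
model = AutoModelForCausalLM.from_pretrained(
    "mistralai/Mixtral-8x7B-v0.1",
    quantization_config=quant_config,
    device_map="auto",
)
```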
### Load the model with Flash Attention 2
```python
from mixtral import Mixtral
print("Running the model with Flash Attention 2:")
mixtral_flash_attention_2 = Mixtral(use_flash_attention_2=True)
print(mixtral_flash_attention_2.generate_text("..."))
```
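In `transformers`, Flash Attention 2 is selected at load time and requires the `flash-attn` package, a supported GPU, and half-precision weights. A hedged sketch of what the flag likely translates to:
```python
# Assumed equivalent of Mixtral(use_flash_attention_2=True).
import torch
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained(
    "mistralai/Mixtral-8x7B-v0.1",
    torch_dtype=torch.float16,  # FA2 needs fp16/bf16 weights
    attn_implementation="flash_attention_2",
    device_map="auto",
)
```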
## Hardware Requirements
- Minimum of 100 GB of GPU RAM ([Mistral AI](https://docs.mistral.ai/models/)); see the rough estimate below
- Hardware requirements after fine-tuning can be found in the discussion [here](https://huggingface.co/mistralai/Mixtral-8x7B-Instruct-v0.1/discussions/3)
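
As a back-of-envelope check on that figure, Mixtral-8x7B has roughly 46.7B total parameters, so the weights alone take about 93 GB in fp16; quantisation shrinks this roughly in proportion to bits per weight (weights only; activations and the KV cache add overhead):
```python
# Rough weights-only VRAM estimate; 46.7e9 is the commonly cited
# total parameter count for Mixtral-8x7B.
params = 46.7e9
for dtype, bytes_per_param in [("fp16", 2), ("int8", 1), ("int4", 0.5)]:
    print(f"{dtype}: ~{params * bytes_per_param / 1e9:.0f} GB")
# fp16: ~93 GB, int8: ~47 GB, int4: ~23 GB
```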
## Licenses
- mixtral.py is under the [GNU General Public License v3](https://github.com/ibnaleem/mixtral.py/blob/main/LICENSE)
- Mixtral-8x7B is under the [Apache 2.0 License](https://huggingface.co/mistralai/Mixtral-8x7B-v0.1)
## Contributing
I welcome contributions from the community and appreciate the time and effort put into making [mixtral.py](https://github.com/ibnaleem/mixtral.py) better. To contribute, please follow the guidelines and steps outlined below:

> Note: **_Your pull request will be closed if you do not specify the changes you've made._**

### Fork the Repository
Start by [forking this repository](https://github.com/ibnaleem/mixtral.py/fork). You can do this by clicking on the ["Fork"](https://github.com/ibnaleem/mixtral.py/fork) button located at the top right corner of the GitHub page. This will create a personal copy of the repository under your own GitHub account.

### Clone the Repository
Next, clone the forked repository to your local machine using the following command:
```bash
$ git clone https://github.com/yourusername/mixtral.py.git
```
Navigate to the cloned directory:
```bash 
$ cd mixtral.py
```
### Create a New Branch
Before making any changes, it's recommended to create a new branch. This ensures that your changes won't interfere with other contributions and keeps the main branch clean. Use the following command to create and switch to a new branch:
```bash
$ git checkout -b branch-name
```
### Make the Desired Changes
Now, you can proceed to make your desired changes to the project. Whether it's fixing bugs, adding new features, improving documentation, or optimizing code, your efforts will be instrumental in enhancing the project.

### Commit and Push Changes
Once you have made the necessary changes, commit your work using the following commands:
```bash
$ git add .
$ git commit -m "Your commit message"
```
Push the changes to your forked repository:
```bash
$ git push origin branch-name
```
### Submit a Pull Request
Head over to the [original repository](https://github.com/ibnaleem/mixtral.py) on GitHub and go to the ["Pull requests"](https://github.com/ibnaleem/mixtral.py/pulls) tab.
1. Click on the "New pull request" button.
2. Select your forked repository and the branch containing your changes.
3. Provide a clear and informative title for your pull request, and use the description box to explain the modifications you have made. **_Your pull request will be closed if you do not specify the changes you've made._**
4. Finally, click on the "Create pull request" button to submit your changes.

## Verifying Signatures
### Import PGP Key into Keyring
```
gpg --keyserver 185.125.188.27 --recv-keys 20247EC023F2769E66181C0F581B4A2A862BBADE
```
or download and import the key directly (note the raw URL; the `blob` URL returns an HTML page, not the key):
```
wget https://raw.githubusercontent.com/ibnaleem/ibnaleem/main/public_key.asc
gpg --import public_key.asc
```
### Download Signature
The signatures (.asc and .sig) can be found in the [/sig directory.](https://github.com/ibnaleem/mixtral.py/tree/main/sig) Download either of them. Open [an issue](https://github.com/ibnaleem/mixtral.py/issues) with the title "invalid signature/wrong signature/corrupt signature" for any issues regarding my signatures.
### Sign My Key 
```
gpg --sign-key 20247EC023F2769E66181C0F581B4A2A862BBADE
gpg --send-keys 20247EC023F2769E66181C0F581B4A2A862BBADE
```
If you cannot upload your signature, see ["*gpg: keyserver receive failed: No route to host*"](https://stackoverflow.com/questions/54801274/gpg-keyserver-receive-failed-no-route-to-host-stack-overflow)
### Verify
```
gpg --verify mixtral.py.asc mixtral.py
```
Desired output:
```
gpg: Signature made Tue  6 Feb 10:27:34 2024 GMT
gpg:                using RSA key 20247EC023F2769E66181C0F581B4A2A862BBADE
gpg: Good signature from "Ibn Aleem <ibnaleem@outlook.com>" [ultimate]
```
## Acknowledgements
- [PyTorch](https://pytorch.org)
- [Transformers library](https://pypi.org/project/transformers/)
- [mistralai/Mixtral-8x7B-v0.1](https://huggingface.co/mistralai/Mixtral-8x7B-v0.1)

            
