mlx-moe

Name: mlx-moe
Version: 0.0.2
Home page: https://github.com/yourusername/yourproject
Summary: A tool to generate text with the mlx-moe model.
Upload time: 2024-02-08 16:45:56
Author: Your Name
Requires Python: >=3.8
License: MIT
Requirements: none recorded
# MLX-MOE Models Package

The `mlx-moe` package provides custom MLX Mixture of Experts (MoE) models, making it straightforward to use MoE models for text generation, particularly together with the `mlx-lm` package. It is aimed at developers and researchers who need advanced text generation and want an easy way to integrate MoE models into their projects.

## Features

- **Custom MoE Model Support**: Easily load and utilize your custom MLX MoE models for text generation.
- **Easy Installation**: Get started quickly with a simple pip installation command.

## Installation

To install `mlx-moe`, simply run the following command in your terminal:

```shell
pip install mlx-moe
```

## Usage

The `mlx-moe` package is designed to be used together with the `mlx-lm` package for generating text. After installing `mlx-moe`, you can load your custom MoE model as follows:

```python
from mlx_moe.load import load_moe

# Path to a local directory containing your converted MoE model weights
model_path = "path_to_your_model"
model = load_moe(model_path)
```
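
For end-to-end generation, a minimal sketch is shown below. It assumes the model directory also contains a Hugging Face tokenizer and that the object returned by `load_moe` is compatible with `mlx_lm.generate`; neither assumption is documented here, so treat this as illustrative rather than official usage:

```python
# Hypothetical end-to-end sketch: load_moe's return type and the tokenizer
# layout are assumptions, not documented by the mlx-moe package itself.
from mlx_moe.load import load_moe
from mlx_lm import generate                 # text-generation helper from mlx-lm
from transformers import AutoTokenizer      # assumes an HF tokenizer ships with the model

model_path = "path_to_your_model"

model = load_moe(model_path)                # assumed to be mlx-lm compatible
tokenizer = AutoTokenizer.from_pretrained(model_path)

# generate() is part of mlx-lm; prompt and max_tokens are standard arguments.
text = generate(model, tokenizer, prompt="Write a haiku about experts.", max_tokens=100)
print(text)
```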

            

Raw data

{
    "_id": null,
    "home_page": "https://github.com/yourusername/yourproject",
    "name": "mlx-moe",
    "maintainer": "",
    "docs_url": null,
    "requires_python": ">=3.8",
    "maintainer_email": "",
    "keywords": "",
    "author": "Your Name",
    "author_email": "your.email@example.com",
    "download_url": "https://files.pythonhosted.org/packages/d1/2a/7d4a38a3c0d28d9b102efabebdd86c3d8315725065fd048085f97832a739/mlx-moe-0.0.2.tar.gz",
    "platform": null,
    "description": "# MLX-MOE Models Package\n\nThe `mlx-moe` package offers custom MLX Mixture of Experts (MoE) models, streamlining the process of leveraging sophisticated MoE models for text generation, particularly in conjunction with the `mlx-lm` package. This tool is designed for developers and researchers who require advanced text generation capabilities, providing an easy pathway to utilize and integrate MoE models into their projects.\n\n## Features\n\n- **Custom MoE Model Support**: Easily load and utilize your custom MLX MoE models for text generation.\n- **Easy Installation**: Get started quickly with a simple pip installation command.\n\n## Installation\n\nTo install `mlx-moe`, simply run the following command in your terminal:\n\n```shell\npip install mlx-moe\n```\n\n## Usage\n\nThe `mlx-moe` package is designed to be used in conjunction with the mlx-lm package for generating text. After installing mlx-moe, you can load your custom MoE models as follows:\n```\nfrom mlx_moe.load import load_moe\n\nmodel_path = \"path_to_your_model\"\nmodel = load_moe(model_path)\n```\n",
    "bugtrack_url": null,
    "license": "MIT",
    "summary": "A tool to generate text with mlx-moe model.",
    "version": "0.0.2",
    "project_urls": {
        "Homepage": "https://github.com/yourusername/yourproject"
    },
    "split_keywords": [],
    "urls": [
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "da866ef399585cb36ec3c23150e80642c207f64bd2d8fc1cd2e7942938c272c8",
                "md5": "53eb538ae3e2dfa69b73cdbde691c8b4",
                "sha256": "c49be49dda17ad3f43f807f42e74527c895893885b2149cb1627ed01d384432a"
            },
            "downloads": -1,
            "filename": "mlx_moe-0.0.2-py3-none-any.whl",
            "has_sig": false,
            "md5_digest": "53eb538ae3e2dfa69b73cdbde691c8b4",
            "packagetype": "bdist_wheel",
            "python_version": "py3",
            "requires_python": ">=3.8",
            "size": 9489,
            "upload_time": "2024-02-08T16:45:53",
            "upload_time_iso_8601": "2024-02-08T16:45:53.828492Z",
            "url": "https://files.pythonhosted.org/packages/da/86/6ef399585cb36ec3c23150e80642c207f64bd2d8fc1cd2e7942938c272c8/mlx_moe-0.0.2-py3-none-any.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "d12a7d4a38a3c0d28d9b102efabebdd86c3d8315725065fd048085f97832a739",
                "md5": "6c54132a0e67a01cf8b14e018cd0d4d5",
                "sha256": "47349ce7f47a9623955df86ecbc8f9419c168d74f0f87dcd478b3f644713472d"
            },
            "downloads": -1,
            "filename": "mlx-moe-0.0.2.tar.gz",
            "has_sig": false,
            "md5_digest": "6c54132a0e67a01cf8b14e018cd0d4d5",
            "packagetype": "sdist",
            "python_version": "source",
            "requires_python": ">=3.8",
            "size": 7974,
            "upload_time": "2024-02-08T16:45:56",
            "upload_time_iso_8601": "2024-02-08T16:45:56.095300Z",
            "url": "https://files.pythonhosted.org/packages/d1/2a/7d4a38a3c0d28d9b102efabebdd86c3d8315725065fd048085f97832a739/mlx-moe-0.0.2.tar.gz",
            "yanked": false,
            "yanked_reason": null
        }
    ],
    "upload_time": "2024-02-08 16:45:56",
    "github": true,
    "gitlab": false,
    "bitbucket": false,
    "codeberg": false,
    "github_user": "yourusername",
    "github_project": "yourproject",
    "github_not_found": true,
    "lcname": "mlx-moe"
}
        