# llm-mpt30b
[![PyPI](https://img.shields.io/pypi/v/llm-mpt30b.svg)](https://pypi.org/project/llm-mpt30b/)
[![Changelog](https://img.shields.io/github/v/release/simonw/llm-mpt30b?include_prereleases&label=changelog)](https://github.com/simonw/llm-mpt30b/releases)
[![Tests](https://github.com/simonw/llm-mpt30b/workflows/Test/badge.svg)](https://github.com/simonw/llm-mpt30b/actions?query=workflow%3ATest)
[![License](https://img.shields.io/badge/license-Apache%202.0-blue.svg)](https://github.com/simonw/llm-mpt30b/blob/main/LICENSE)
Plugin for [LLM](https://llm.datasette.io/) adding support for the [MPT-30B language model](https://huggingface.co/mosaicml/mpt-30b).
This plugin uses [TheBloke/mpt-30B-GGML](https://huggingface.co/TheBloke/mpt-30B-GGML). The code was inspired by [abacaj/mpt-30B-inference](https://github.com/abacaj/mpt-30B-inference).
## Installation
Install this plugin in the same environment as LLM.
```bash
llm install llm-mpt30b
```
After installing the plugin you will need to download the ~19GB model file. You can do this by running:
```bash
llm mpt30b download
```
## Usage
This plugin adds a model called `mpt30b`. You can execute it like this:
```bash
llm -m mpt30b "Three great names for a pet goat"
```
The alias `-m mpt` works as well.
You can pass the option `-o verbose 1` to see more verbose output: currently a progress bar showing any additional downloads made during execution.
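The model can also be used from Python via LLM's Python API. A minimal sketch, assuming the plugin is installed and the model file has already been downloaded with `llm mpt30b download`:

```python
import llm

# Look up the model registered by this plugin
model = llm.get_model("mpt30b")

# Run a prompt and print the generated text
response = model.prompt("Three great names for a pet goat")
print(response.text())
```

Options such as `verbose` can be passed as keyword arguments to `model.prompt()`, mirroring the `-o` flags on the command line.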
## Development
To set up this plugin locally, first check out the code. Then create a new virtual environment:
```bash
cd llm-mpt30b
python3 -m venv venv
source venv/bin/activate
```
Now install the dependencies and test dependencies:
```bash
pip install -e '.[test]'
```
To run the tests:
```bash
pytest
```