llm-smollm2

- Name: llm-smollm2
- Version: 0.1.2
- Summary: SmolLM2-135M-Instruct.Q4_1 for LLM
- Upload time: 2025-02-07 07:03:31
- Author: Simon Willison
- Requires Python: >=3.9
- License: Apache-2.0
# llm-smollm2

[![PyPI](https://img.shields.io/pypi/v/llm-smollm2.svg)](https://pypi.org/project/llm-smollm2/)
[![Changelog](https://img.shields.io/github/v/release/simonw/llm-smollm2?include_prereleases&label=changelog)](https://github.com/simonw/llm-smollm2/releases)
[![Tests](https://github.com/simonw/llm-smollm2/actions/workflows/test.yml/badge.svg)](https://github.com/simonw/llm-smollm2/actions/workflows/test.yml)
[![License](https://img.shields.io/badge/license-Apache%202.0-blue.svg)](https://github.com/simonw/llm-smollm2/blob/main/LICENSE)

SmolLM2-135M-Instruct.Q4_1 for LLM. [Background on this project](https://simonwillison.net/2025/Feb/7/pip-install-llm-smollm2/).

## Installation

Install this plugin in the same environment as [LLM](https://llm.datasette.io/).
```bash
llm install llm-smollm2
```
If you have [uv](https://github.com/astral-sh/uv) installed, you can chat with the model without an installation step:
```bash
uvx --with llm-smollm2 llm chat -m SmolLM2
```
## Usage

This plugin bundles a full copy of the [SmolLM2-135M-Instruct.Q4_1](https://huggingface.co/QuantFactory/SmolLM2-135M-Instruct-GGUF/blob/ab810cf68114990406fdf996510dd3d3c6adbdf5/SmolLM2-135M-Instruct.Q4_1.gguf) quantized version of the [SmolLM2-135M-Instruct](https://huggingface.co/HuggingFaceTB/SmolLM2-135M-Instruct) model by [HuggingFaceTB](https://huggingface.co/HuggingFaceTB).
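As an aside, GGUF files such as the bundled model begin with a four-byte `GGUF` magic number, so a download can be sanity-checked with a few lines of Python (the path below is hypothetical; the plugin ships the file inside the package):

```python
def is_gguf(path):
    """Return True if the file at `path` starts with the GGUF magic bytes."""
    with open(path, "rb") as f:
        magic = f.read(4)
    return magic == b"GGUF"

# Hypothetical filename, matching the upstream artifact:
# is_gguf("SmolLM2-135M-Instruct.Q4_1.gguf")
```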

Once installed, run the model like this:
```bash
llm -m SmolLM2 'Are dogs real?'
```
Or to chat with the model (keeping it resident in memory):
```bash
llm chat -m SmolLM2
```

## Development

To set up this plugin locally, first check out the code. Then create a new virtual environment:
```bash
cd llm-smollm2
python -m venv venv
source venv/bin/activate
```
Now install the dependencies and test dependencies:
```bash
llm install -e '.[test]'
```
To run the tests:
```bash
python -m pytest
```

            

## Raw data

            {
    "_id": null,
    "home_page": null,
    "name": "llm-smollm2",
    "maintainer": null,
    "docs_url": null,
    "requires_python": ">=3.9",
    "maintainer_email": null,
    "keywords": null,
    "author": "Simon Willison",
    "author_email": null,
    "download_url": "https://files.pythonhosted.org/packages/ca/76/0f6954a442e68f33741857f0aa56a3a412faea2302a116322be9d4d22fdf/llm_smollm2-0.1.2.tar.gz",
    "platform": null,
    "description": "# llm-smollm2\n\n[![PyPI](https://img.shields.io/pypi/v/llm-smollm2.svg)](https://pypi.org/project/llm-smollm2/)\n[![Changelog](https://img.shields.io/github/v/release/simonw/llm-smollm2?include_prereleases&label=changelog)](https://github.com/simonw/llm-smollm2/releases)\n[![Tests](https://github.com/simonw/llm-smollm2/actions/workflows/test.yml/badge.svg)](https://github.com/simonw/llm-smollm2/actions/workflows/test.yml)\n[![License](https://img.shields.io/badge/license-Apache%202.0-blue.svg)](https://github.com/simonw/llm-smollm2/blob/main/LICENSE)\n\nSmolLM2-135M-Instruct.Q4_1 for LLM. [Background on this project](https://simonwillison.net/2025/Feb/7/pip-install-llm-smollm2/).\n\n## Installation\n\nInstall this plugin in the same environment as [LLM](https://llm.datasette.io/).\n```bash\nllm install llm-smollm2\n```\nIf you have [uv](https://github.com/astral-sh/uv) installed you can chat with the model without any installation step like this:\n```bash\nuvx --with llm-smollm2 llm chat -m SmolLM2\n```\n## Usage\n\nThis plugin bundles a full copy of the [SmolLM2-135M-Instruct.Q4_1](https://huggingface.co/QuantFactory/SmolLM2-135M-Instruct-GGUF/blob/ab810cf68114990406fdf996510dd3d3c6adbdf5/SmolLM2-135M-Instruct.Q4_1.gguf) quantized version of the [SmolLM2-135M-Instruct](https://huggingface.co/HuggingFaceTB/SmolLM2-135M-Instruct) model by [HuggingFaceTB](https://huggingface.co/HuggingFaceTB).\n\nOnce installed, run the model like this:\n```bash\nllm -m SmolLM2 'Are dogs real?'\n```\nOr to chat with the model (keeping it resident in memory):\n```bash\nllm chat -m SmolLM2\n```\n\n## Development\n\nTo set up this plugin locally, first checkout the code. Then create a new virtual environment:\n```bash\ncd llm-smollm2\npython -m venv venv\nsource venv/bin/activate\n```\nNow install the dependencies and test dependencies:\n```bash\nllm install -e '.[test]'\n```\nTo run the tests:\n```bash\npython -m pytest\n```\n",
    "bugtrack_url": null,
    "license": "Apache-2.0",
    "summary": "SmolLM2-135M-Instruct.Q4_1 for LLM",
    "version": "0.1.2",
    "project_urls": {
        "CI": "https://github.com/simonw/llm-smollm2/actions",
        "Changelog": "https://github.com/simonw/llm-smollm2/releases",
        "Homepage": "https://github.com/simonw/llm-smollm2",
        "Issues": "https://github.com/simonw/llm-smollm2/issues"
    },
    "split_keywords": [],
    "urls": [
        {
            "comment_text": null,
            "digests": {
                "blake2b_256": "06be9df8343f073e455d94f4acda2c211e84dec4d2879ee959c8dfbaa40d7d1b",
                "md5": "256234ff71d0c933d0cec42b316cd6fc",
                "sha256": "bcc81830d10ce7d9e76640cad826a4b79ed3e4547c78a0be5c4f2fb0e2448c70"
            },
            "downloads": -1,
            "filename": "llm_smollm2-0.1.2-py3-none-any.whl",
            "has_sig": false,
            "md5_digest": "256234ff71d0c933d0cec42b316cd6fc",
            "packagetype": "bdist_wheel",
            "python_version": "py3",
            "requires_python": ">=3.9",
            "size": 92871040,
            "upload_time": "2025-02-07T07:03:18",
            "upload_time_iso_8601": "2025-02-07T07:03:18.840953Z",
            "url": "https://files.pythonhosted.org/packages/06/be/9df8343f073e455d94f4acda2c211e84dec4d2879ee959c8dfbaa40d7d1b/llm_smollm2-0.1.2-py3-none-any.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": null,
            "digests": {
                "blake2b_256": "ca760f6954a442e68f33741857f0aa56a3a412faea2302a116322be9d4d22fdf",
                "md5": "d999af48956749b3924c34d9c231e68f",
                "sha256": "97a14f7fbec7a4612c99eb8d1d458b2763e189f082220d36c88d2eeaed153469"
            },
            "downloads": -1,
            "filename": "llm_smollm2-0.1.2.tar.gz",
            "has_sig": false,
            "md5_digest": "d999af48956749b3924c34d9c231e68f",
            "packagetype": "sdist",
            "python_version": "source",
            "requires_python": ">=3.9",
            "size": 92863474,
            "upload_time": "2025-02-07T07:03:31",
            "upload_time_iso_8601": "2025-02-07T07:03:31.520188Z",
            "url": "https://files.pythonhosted.org/packages/ca/76/0f6954a442e68f33741857f0aa56a3a412faea2302a116322be9d4d22fdf/llm_smollm2-0.1.2.tar.gz",
            "yanked": false,
            "yanked_reason": null
        }
    ],
    "upload_time": "2025-02-07 07:03:31",
    "github": true,
    "gitlab": false,
    "bitbucket": false,
    "codeberg": false,
    "github_user": "simonw",
    "github_project": "llm-smollm2",
    "travis_ci": false,
    "coveralls": false,
    "github_actions": true,
    "lcname": "llm-smollm2"
}
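The `sha256` digests recorded above can be checked against a downloaded artifact using only the standard library; a minimal sketch (the filename is an example):

```python
import hashlib

def sha256_of(path, chunk_size=65536):
    """Stream a file through SHA-256 and return the hex digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Compare against the "sha256" value listed for the artifact, e.g.:
# sha256_of("llm_smollm2-0.1.2.tar.gz") == "97a14f7fbec7..."
```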
        