magic-lamp


Name: magic-lamp
Version: 0.4.1
Home page: https://github.com/rameshvarun/magic-lamp
Summary: Easily integrate LLMs into Python code.
Upload time: 2024-08-04 05:35:15
Maintainer: None
Docs URL: None
Author: Varun Ramesh
Requires Python: <4.0,>=3.9
License: MIT
Keywords: None
Requirements: No requirements were recorded.
# magic-lamp
[![PyPI - Version](https://img.shields.io/pypi/v/magic-lamp)](https://pypi.org/project/magic-lamp/)

Create magic LLM-powered Python functions that return anything you ask for. Many caveats.

## Quickstart

```bash
pip install magic-lamp
```

Define a function with a description and a set of examples.

```python
import magic_lamp

get_atoms = magic_lamp.Function(
    "Break this molecule down into it's constituent atoms. Return as a set.",
    examples=[
        ("water", {"hydrogen", "oxygen"}),
        ("glucose", {"carbon", "hydrogen", "oxygen"}),
    ],
)

print(get_atoms("ammonia")) # => {"nitrogen", "hydrogen"}
```

Functions can return any Python literal (strings, numbers, dicts, tuples, lists, etc.). No API keys are required, since by default `magic-lamp` downloads and runs a local LLM.
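
Since outputs are parsed as Python literals, a function can just as well return a dict. A minimal sketch using the same constructor (this particular task and its example values are illustrative, not taken from the package docs):

```python
import magic_lamp

# Illustrative example: the function returns a dict literal rather than a set.
count_atoms = magic_lamp.Function(
    "Count how many atoms of each element are in this molecule. Return as a dict.",
    examples=[
        ("water", {"hydrogen": 2, "oxygen": 1}),
        ("methane", {"carbon": 1, "hydrogen": 4}),
    ],
)

print(count_atoms("carbon dioxide"))  # expected: {"carbon": 1, "oxygen": 2}
```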

## Configuring the LLM

By default, `magic-lamp` downloads and runs a local LLM from Hugging Face. For more complex tasks, OpenAI models generally perform better.

### Using OpenAI

`OPENAI_API_KEY` must be set in the environment. Pass in the name of a `gpt-*` model to the function constructor.
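
One way to supply the key is to set it from Python before constructing the function (a sketch; the key value shown is a placeholder):

```python
import os

# Placeholder key for illustration -- substitute your real key, or export
# OPENAI_API_KEY in your shell before launching Python.
os.environ["OPENAI_API_KEY"] = "sk-..."
```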

```python
import magic_lamp

format_number = magic_lamp.Function(
    'Write this number out in words.',
    examples=[
        (1, "one"),
        (35, "thirty-five"),
        (15690, "fifteen thousand, six hundred ninety"),
    ],
    model="gpt-4o-mini"
)

print(format_number(328745226793))
```

## Links
- https://github.com/jackmpcollins/magentic - A similar concept but using decorators.
- https://github.com/PrefectHQ/marvin - Another similar concept.
- https://github.com/abetlen/llama-cpp-python - Used by this library to run the local LLM.
- https://ai.meta.com/blog/meta-llama-3-1/ - Llama 3.1 8b is the default model.
- https://huggingface.co/bullerwins/Meta-Llama-3.1-8B-Instruct-GGUF - Uses these GGUFs by default.


Raw data

```json
{
    "_id": null,
    "home_page": "https://github.com/rameshvarun/magic-lamp",
    "name": "magic-lamp",
    "maintainer": null,
    "docs_url": null,
    "requires_python": "<4.0,>=3.9",
    "maintainer_email": null,
    "keywords": null,
    "author": "Varun Ramesh",
    "author_email": "varunramesh@live.com",
    "download_url": "https://files.pythonhosted.org/packages/dd/b2/08345d4fddbb974bf8b111fdc2dd80a753e8979de1bd7c5607596ca0f77a/magic_lamp-0.4.1.tar.gz",
    "platform": null,
    "description": "# magic-lamp\n[![PyPI - Version](https://img.shields.io/pypi/v/magic-lamp)](https://pypi.org/project/magic-lamp/)\n\nCreate magic LLM-powered Python functions that return anything you ask for. Many caveats.\n\n## Quickstart\n\n```bash\npip install magic-lamp\n```\n\nDefine a function with a description and a set of examples.\n\n```python\nimport magic_lamp\n\nget_atoms = magic_lamp.Function(\n    \"Break this molecule down into it's constituent atoms. Return as a set.\",\n    examples=[\n        (\"water\", {\"hydrogen\", \"oxygen\"}),\n        (\"glucose\", {\"carbon\", \"hydrogen\", \"oxygen\"}),\n    ],\n)\n\nprint(get_atoms(\"ammonia\")) # => {\"nitrogen\", \"hydrogen\"}\n```\n\nFunctions can return any Python literal (strings, numbers, dicts, tuples lists, etc). No API keys are required, since by default `magic-lamp` downloads and runs a local LLM.\n\n## Configuring the LLM\n\nBy default, `magic-lamp` downloads and runs a local LLM from Hugging Face. For more complex tasks, OpenAI models will perform better.\n\n### Using OpenAI\n\n`OPENAI_API_KEY` must be set in the environment. Pass in the name of a `gpt-*` model to the function constructor.\n\n```python\nimport magic_lamp\n\nformat_number = magic_lamp.Function(\n    'Write this number out in words.',\n    examples=[\n        (1, \"one\"),\n        (35, \"thirty-five\"),\n        (15690, \"fifteen thousand, six hundred ninety\"),\n    ],\n    model=\"gpt-4o-mini\"\n)\n\nprint(format_number(328745226793))\n```\n\n## Links\n- https://github.com/jackmpcollins/magentic - A similar concept but using decorators.\n- https://github.com/PrefectHQ/marvin - Antoher similar concept.\n- https://github.com/abetlen/llama-cpp-python - Used by this library to run the local LLM.\n- https://ai.meta.com/blog/meta-llama-3-1/ - Llama 3.1 8b is the default model.\n- https://huggingface.co/bullerwins/Meta-Llama-3.1-8B-Instruct-GGUF - Uses these GGUFs by default.\n",
    "bugtrack_url": null,
    "license": "MIT",
    "summary": "Easily integrate LLMs into Python code.",
    "version": "0.4.1",
    "project_urls": {
        "Homepage": "https://github.com/rameshvarun/magic-lamp",
        "Repository": "https://github.com/rameshvarun/magic-lamp"
    },
    "split_keywords": [],
    "urls": [
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "8d4f3ec727a54b982e065fa150e0d2f6c3c4e0ea4a659d36bcce8fc688709975",
                "md5": "0ca2ad0cbba664289f27196586a01ace",
                "sha256": "ecc215928b03b99b09f3722c215b938e51a0263d8f4f62b6427b7c78a3c87a70"
            },
            "downloads": -1,
            "filename": "magic_lamp-0.4.1-py3-none-any.whl",
            "has_sig": false,
            "md5_digest": "0ca2ad0cbba664289f27196586a01ace",
            "packagetype": "bdist_wheel",
            "python_version": "py3",
            "requires_python": "<4.0,>=3.9",
            "size": 4922,
            "upload_time": "2024-08-04T05:35:13",
            "upload_time_iso_8601": "2024-08-04T05:35:13.884309Z",
            "url": "https://files.pythonhosted.org/packages/8d/4f/3ec727a54b982e065fa150e0d2f6c3c4e0ea4a659d36bcce8fc688709975/magic_lamp-0.4.1-py3-none-any.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "ddb208345d4fddbb974bf8b111fdc2dd80a753e8979de1bd7c5607596ca0f77a",
                "md5": "923ec0fb1f1ef8636a35e96ee4fdc521",
                "sha256": "234f496ab958c3c221f8090b56a0f6e673c20bd8b3cae072440b5203a48b9f6d"
            },
            "downloads": -1,
            "filename": "magic_lamp-0.4.1.tar.gz",
            "has_sig": false,
            "md5_digest": "923ec0fb1f1ef8636a35e96ee4fdc521",
            "packagetype": "sdist",
            "python_version": "source",
            "requires_python": "<4.0,>=3.9",
            "size": 4460,
            "upload_time": "2024-08-04T05:35:15",
            "upload_time_iso_8601": "2024-08-04T05:35:15.689149Z",
            "url": "https://files.pythonhosted.org/packages/dd/b2/08345d4fddbb974bf8b111fdc2dd80a753e8979de1bd7c5607596ca0f77a/magic_lamp-0.4.1.tar.gz",
            "yanked": false,
            "yanked_reason": null
        }
    ],
    "upload_time": "2024-08-04 05:35:15",
    "github": true,
    "gitlab": false,
    "bitbucket": false,
    "codeberg": false,
    "github_user": "rameshvarun",
    "github_project": "magic-lamp",
    "travis_ci": false,
    "coveralls": false,
    "github_actions": false,
    "lcname": "magic-lamp"
}
```
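
The `digests` recorded above can be used to check a downloaded artifact. A minimal sketch using only the standard library (the URL and expected hash are taken from the sdist entry in the metadata):

```python
import hashlib
import urllib.request

URL = (
    "https://files.pythonhosted.org/packages/dd/b2/"
    "08345d4fddbb974bf8b111fdc2dd80a753e8979de1bd7c5607596ca0f77a/"
    "magic_lamp-0.4.1.tar.gz"
)
EXPECTED_SHA256 = "234f496ab958c3c221f8090b56a0f6e673c20bd8b3cae072440b5203a48b9f6d"

# Download the sdist and hash it in memory (the file is only ~4.5 KB).
with urllib.request.urlopen(URL) as resp:
    digest = hashlib.sha256(resp.read()).hexdigest()

assert digest == EXPECTED_SHA256, f"sha256 mismatch: {digest}"
print("sha256 verified")
```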