# lasagna-ai

PyPI metadata:

| Field | Value |
|---|---|
| Name | lasagna-ai |
| Version | 0.10.0 |
| Summary | Layered agents! |
| Author | Ryan Henning <ryan@rhobota.com> |
| Requires Python | >=3.8 |
| Keywords | agent, agents, ai, hierarchical, layered, layers, llm |
| Uploaded | 2024-11-09 20:20:27 |
| Source | https://github.com/Rhobota/lasagna-ai |
| Issues | https://github.com/Rhobota/lasagna-ai/issues |
| Documentation | https://github.com/Rhobota/lasagna-ai#readme |
![Lasagna AI Logo](https://raw.githubusercontent.com/Rhobota/lasagna-ai/main/logos/lasagna-ai.png)

# Lasagna AI

[![PyPI - Version](https://img.shields.io/pypi/v/lasagna-ai.svg)](https://pypi.org/project/lasagna-ai)
[![PyPI - Python Version](https://img.shields.io/pypi/pyversions/lasagna-ai.svg)](https://pypi.org/project/lasagna-ai)
![Test Status](https://github.com/Rhobota/lasagna-ai/actions/workflows/test.yml/badge.svg?branch=main)
[![Downloads](https://static.pepy.tech/badge/lasagna-ai)](https://pepy.tech/project/lasagna-ai)

- 🥞  **Layered agents!**
  - Agents for your agents!
  - Tool-use, structured output ("extraction"), and layering FTW 💪
  - Ever wanted a _recursive_ agent? Now you can have one! 🤯
  - _Parallel_ tool-calling by default.
  - Fully asyncio.
  - 100% Python type hints.
  - Functional-style 😎
  - (optional) Easy & pluggable caching! 🏦

- 🚣  **Streamable!**
  - Event streams for _everything_.
  - Asyncio generators are awesome.

- 🗃️ **Easy database integration!**
  - Don't rage when trying to store raw messages and token counts. 😡 🤬
  - Yes, you _can_ have _both_ streaming and easy database storage.

- ↔️ **Provider/model agnostic and interoperable!**
  - Native support for [OpenAI](https://platform.openai.com/docs/models), [Anthropic](https://docs.anthropic.com/en/docs/welcome), [NVIDIA NIM/NGC](https://build.nvidia.com/explore/reasoning) (+ more to come).
  - Message representations are canonized. 😇
  - Supports vision!
  - Easily build committees!
  - Swap providers or models mid-conversation (see the sketch after this list).
  - Delegate tasks among model providers or model sizes.
  - Parallelize all the things.

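To make that concrete, here is a minimal sketch of the same agent bound to different providers, using the binder pattern from the Quickstart below. `BIND_OPENAI_gpt_4o_mini` is taken from that Quickstart; the Anthropic binder name is a guess for illustration only, so check `lasagna.known_models` for the binders your version actually ships.

```python
from lasagna import known_models, build_simple_agent

# One agent definition...
my_agent = build_simple_agent(name = 'agent', tools = [])

# ...bound to whichever provider/model you like:
openai_agent = known_models.BIND_OPENAI_gpt_4o_mini()(my_agent)

# Hypothetical binder name -- see lasagna.known_models for what's actually available:
# anthropic_agent = known_models.BIND_ANTHROPIC_claude_3_5_sonnet()(my_agent)
```
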
-----

## Table of Contents

- [Installation](#installation)
- [Used By](#used-by)
- [Quickstart](#quickstart)
- [Debug Logging](#debug-logging)
- [Special Thanks](#special-thanks)
- [License](#license)

## Installation

```console
pip install -U "lasagna-ai[openai,anthropic]"
```

If you want to run all of the [./examples](./examples), you can also
install the extra dependencies those examples use:

```console
pip install -U "lasagna-ai[openai,anthropic,example-deps]"
```

## Used By

Lasagna is used in production by:

[![AutoAuto](https://raw.githubusercontent.com/Rhobota/lasagna-ai/main/logos/autoauto.png)](https://www.autoauto.ai/)

## Quickstart

Here is the _simplest possible_ agent (it doesn't add *anything* to the underlying model).
More complex agents add tools and/or use layers of agents, but not this one!
Run it in your terminal and you can chat interactively with the model. 🤩

(taken from [./examples/quickstart.py](./examples/quickstart.py))

```python
from lasagna import (
    known_models,
    build_simple_agent,
)

from lasagna.tui import (
    tui_input_loop,
)

from typing import List, Callable

import asyncio

from dotenv import load_dotenv; load_dotenv()   # load provider API keys (e.g. OPENAI_API_KEY) from a .env file


MODEL_BINDER = known_models.BIND_OPENAI_gpt_4o_mini()


async def main() -> None:
    system_prompt = "You are grumpy."
    tools: List[Callable] = []
    my_agent = build_simple_agent(name = 'agent', tools = tools)
    my_bound_agent = MODEL_BINDER(my_agent)
    await tui_input_loop(my_bound_agent, system_prompt)


if __name__ == '__main__':
    asyncio.run(main())
```
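
The model binder above targets OpenAI, so the script needs an API key at runtime. Assuming the provider picks up the standard `OPENAI_API_KEY` environment variable (the `load_dotenv()` call is there to read it from a `.env` file), running the quickstart from a checkout of the repo looks roughly like this:

```console
export OPENAI_API_KEY=sk-...   # or put OPENAI_API_KEY=... in a .env file next to the script
python examples/quickstart.py
```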

**Want to add your first tool?** LLMs can't reliably do arithmetic
(beyond simple operations on small numbers), so let's give our
model a tool for it! 😎

(full example at [./examples/quickstart_with_math_tool.py](./examples/quickstart_with_math_tool.py))

```python
import sympy as sp

...

def evaluate_math_expression(expression: str) -> float:
    """
    This tool evaluates a math expression and returns the result.
    Pass the math expression as a string, for example:
     - "3 * 6 + 1"
     - "cos(2 * pi / 3) + log(8)"
     - "(4.5/2) + (6.3/1.2)"
     - ... etc
    :param: expression: str: the math expression to evaluate
    """
    expr = sp.sympify(expression)
    result = float(expr.evalf())
    return result

...

    ...
    tools: List[Callable] = [
        evaluate_math_expression,
    ]
    my_agent = build_simple_agent(name = 'agent', tools = tools)
    ...

...
```
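
Since the tool is just a plain Python function, you can sanity-check it on its own (this only needs `sympy`, not Lasagna) before handing it to the agent:

```python
# Quick sanity check of the tool defined above:
print(evaluate_math_expression("3 * 6 + 1"))                 # 19.0
print(evaluate_math_expression("cos(2 * pi / 3) + log(8)"))  # ~1.579 (sympy's log is the natural log)
```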

**Simple RAG:** Everyone's favorite tool: _Retrieval-Augmented Generation_ (RAG). Let's GO! 📚💨  
See: [./examples/demo_rag.py](./examples/demo_rag.py)
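
To give a feel for how retrieval fits the same tool pattern, here is a rough sketch (the corpus, scoring, and `search_documents` name below are made up for illustration; see [./examples/demo_rag.py](./examples/demo_rag.py) for the real approach):

```python
from typing import List

# A tiny in-memory "corpus" -- a real app would query a vector store or search index.
DOCUMENTS: List[str] = [
    "Lasagna AI lets you layer agents on top of agents.",
    "Tools are plain Python functions with descriptive docstrings.",
    "Agents can be bound to different model providers.",
]

def search_documents(query: str) -> str:
    """
    This tool searches the document collection and returns the most relevant passages.
    :param: query: str: keywords describing what to look for
    """
    words = set(query.lower().split())
    ranked = sorted(DOCUMENTS, key=lambda d: len(words & set(d.lower().split())), reverse=True)
    return "\n".join(ranked[:2])

# ... then pass it to your agent like any other tool:
#     tools: List[Callable] = [search_documents]
```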

## Debug Logging

This library logs using Python's built-in `logging` module. Most messages are logged at the `INFO` level, so here's a snippet you can put in _your_ app to see those traces:

```python
import logging

logging.basicConfig(
    level=logging.INFO,
    format='%(asctime)s - %(name)s - %(levelname)s - %(message)s',
)

# ... now use Lasagna as you normally would, but you'll see extra log traces!
```
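
If you only want Lasagna's traces (and not `INFO` chatter from every other library), you can target its logger by name instead. Assuming the library uses the usual module-based logger names, that name is `'lasagna'`:

```python
import logging

logging.basicConfig(level=logging.WARNING)           # keep other libraries quiet
logging.getLogger('lasagna').setLevel(logging.INFO)  # but show Lasagna's INFO traces
```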

## Special Thanks

Special thanks to those who inspired this library:
- Numa Dhamani (buy her book: [Introduction to Generative AI](https://a.co/d/03dHnRmX))
- Dave DeCaprio's [voice-stream library](https://github.com/DaveDeCaprio/voice-stream)

## License

`lasagna-ai` is distributed under the terms of the [MIT](https://spdx.org/licenses/MIT.html) license.

## Joke Acronym

Layered Agents with toolS And aGeNts and Ai

            
