funcchain

Name: funcchain
Version: 0.3.3
Summary: 🔖 write prompts as python functions
Upload time: 2024-04-20 10:54:50
Requires Python: <3.13,>=3.10
Keywords: agent framework, ai, cognitive systems, funcchain, langchain, llm, pydantic, pythonic

# funcchain

<!-- markdownlint-disable MD033 -->
[![Version](https://badge.fury.io/py/funcchain.svg)](https://badge.fury.io/py/funcchain)
[![tests](https://github.com/shroominic/funcchain/actions/workflows/code-check.yml/badge.svg)](https://github.com/shroominic/funcchain/actions/workflows/code-check.yml)
![PyVersion](https://img.shields.io/pypi/pyversions/funcchain)
![License](https://img.shields.io/github/license/shroominic/funcchain)
![Downloads](https://img.shields.io/pypi/dm/funcchain)
[![Discord](https://img.shields.io/discord/1192334452110659664?label=discord)](https://discord.gg/TrwWWMXdtR)
<img alt="GitHub Contributors" src="https://img.shields.io/github/contributors/shroominic/funcchain" />
<img alt="GitHub Last Commit" src="https://img.shields.io/github/last-commit/shroominic/funcchain" />
[![Pydantic v2](https://img.shields.io/endpoint?url=https://raw.githubusercontent.com/pydantic/pydantic/main/docs/badge/v2.json)](https://docs.pydantic.dev/latest/contributing/#badges)
[![Twitter Follow](https://img.shields.io/twitter/follow/shroominic?style=social)](https://x.com/shroominic)
<!-- <img alt="Repo Size" src="https://img.shields.io/github/repo-size/shroominic/funcchain" /> -->

```bash
pip install funcchain
```

## Introduction

`funcchain` is the *most pythonic* way of writing cognitive systems. Leveraging pydantic models as output schemas, combined with langchain in the backend, allows for seamless integration of LLMs into your apps.
It utilizes OpenAI function calling or LlamaCpp grammars (JSON-schema mode) for efficient structured output.
Under the hood, it compiles the funcchain syntax into langchain runnables, so you can easily invoke, stream, or batch process your pipelines.
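For intuition, the structured output mechanism boils down to deriving a JSON schema from your pydantic output model and validating the raw LLM reply against it. This sketch uses plain pydantic only (no funcchain internals are shown here) to illustrate that round trip:

```python
from pydantic import BaseModel

class Recipe(BaseModel):
    ingredients: list[str]
    instructions: list[str]
    duration: int

# the JSON schema derived from the model constrains the llm output
schema = Recipe.model_json_schema()
print(sorted(schema["properties"]))  # ['duration', 'ingredients', 'instructions']

# a raw json reply from the llm is validated into the typed model
raw = '{"ingredients": ["flour"], "instructions": ["bake"], "duration": 45}'
recipe = Recipe.model_validate_json(raw)
print(recipe.duration)  # 45
```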

[![Open in GitHub Codespaces](https://github.com/codespaces/badge.svg)](https://codespaces.new/ricklamers/funcchain-demo)

## Simple Demo

```python
from funcchain import chain
from pydantic import BaseModel

# define your output shape
class Recipe(BaseModel):
    ingredients: list[str]
    instructions: list[str]
    duration: int

# write prompts utilising all native python features
def generate_recipe(topic: str) -> Recipe:
    """
    Generate a recipe for a given topic.
    """
    return chain() # <- this is doing all the magic

# generate llm response
recipe = generate_recipe("christmas dinner")

# recipe is automatically parsed into the pydantic model
print(recipe.ingredients)
```

## Complex Structured Output

```python
from pydantic import BaseModel, Field
from funcchain import chain

# define nested models
class Item(BaseModel):
    name: str = Field(description="Name of the item")
    description: str = Field(description="Description of the item")
    keywords: list[str] = Field(description="Keywords for the item")

class ShoppingList(BaseModel):
    items: list[Item]
    store: str = Field(description="The store to buy the items from")

class TodoList(BaseModel):
    todos: list[Item]
    urgency: int = Field(description="The urgency of all tasks (1-10)")

# support for union types
def extract_list(user_input: str) -> TodoList | ShoppingList:
    """
    The user input is either a shopping list or a todo list.
    """
    return chain()

# the model will choose the output type automatically
lst = extract_list(
    input("Enter your list: ")
)

# custom handler based on type
match lst:
    case ShoppingList(items=items, store=store):
        print("Here is your Shopping List: ")
        for item in items:
            print(f"{item.name}: {item.description}")
        print(f"You need to go to: {store}")

    case TodoList(todos=todos, urgency=urgency):
        print("Here is your Todo List: ")
        for item in todos:
            print(f"{item.name}: {item.description}")
        print(f"Urgency: {urgency}")
```

## Vision Models

```python
from funcchain import Image
from pydantic import BaseModel, Field
from funcchain import chain, settings

# set global llm using model identifiers (see MODELS.md)
settings.llm = "openai/gpt-4-vision-preview"

# everything defined is part of the prompt
class AnalysisResult(BaseModel):
    """The result of an image analysis."""

    theme: str = Field(description="The theme of the image")
    description: str = Field(description="A description of the image")
    objects: list[str] = Field(description="A list of objects found in the image")

# easy use of images as input with structured output
def analyse_image(image: Image) -> AnalysisResult:
    """
    Analyse the image and extract its
    theme, description and objects.
    """
    return chain()

result = analyse_image(Image.open("examples/assets/old_chinese_temple.jpg"))

print("Theme:", result.theme)
print("Description:", result.description)
for obj in result.objects:
    print("Found this object:", obj)
```

## Seamless local model support

```python
from pydantic import BaseModel, Field
from funcchain import chain, settings

# use a local model served via ollama (pulled automatically if missing)
settings.llm = "ollama/openchat"

class SentimentAnalysis(BaseModel):
    analysis: str
    sentiment: bool = Field(description="True for Happy, False for Sad")

def analyze(text: str) -> SentimentAnalysis:
    """
    Determines the sentiment of the text.
    """
    return chain()

# generates using the local model
result = analyze("I really like when my dog does a trick!")

# guaranteed structured output (even with local models!)
print(result.analysis)
```

## Features

- 🐍 pythonic
- 🔀 easy swap between openai or local models
- 🔄 dynamic output types (pydantic models, or primitives)
- 👁️ vision llm support
- 🧠 langchain_core as backend
- 📝 jinja templating for prompts
- 🏗️ reliable structured output
- 🔁 auto retry parsing
- 🔧 langsmith support
- 🔄 sync, async, streaming, parallel, fallbacks
- 📦 gguf download from huggingface
- ✅ type hints for all functions and mypy support
- 🗣️ chat router component
- 🧩 composable with langchain LCEL
- 🛠️ easy error handling
- 🚦 enums and literal support
- 📐 custom parsing types
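For the enums and literal support mentioned above, the output model can constrain a field to a fixed set of values; the validation side of this pattern (plain pydantic, no LLM call involved) looks like:

```python
from enum import Enum
from typing import Literal
from pydantic import BaseModel

class Mood(str, Enum):
    happy = "happy"
    sad = "sad"

class Classification(BaseModel):
    mood: Mood
    confidence: Literal["low", "medium", "high"]

# an llm reply is only accepted if it hits one of the allowed values
c = Classification.model_validate({"mood": "happy", "confidence": "high"})
print(c.mood.value, c.confidence)  # happy high
```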

## Documentation

[Check out the docs here](https://shroominic.github.io/funcchain/) 👈

We also highly recommend trying out the examples in the `./examples` folder.

## Contribution

You want to contribute? Thanks, that's great!
For more information, check out the [Contributing Guide](docs/contributing/dev-setup.md).
Please run the dev setup to get started:

```bash
git clone https://github.com/shroominic/funcchain.git && cd funcchain

./dev_setup.sh
```

            
