# funcchain
<!-- markdownlint-disable MD033 -->
[![Version](https://badge.fury.io/py/funcchain.svg)](https://badge.fury.io/py/funcchain)
[![tests](https://github.com/shroominic/funcchain/actions/workflows/code-check.yml/badge.svg)](https://github.com/shroominic/funcchain/actions/workflows/code-check.yml)
![PyVersion](https://img.shields.io/pypi/pyversions/funcchain)
![License](https://img.shields.io/github/license/shroominic/funcchain)
![Downloads](https://img.shields.io/pypi/dm/funcchain)
[![Discord](https://img.shields.io/discord/1192334452110659664?label=discord)](https://discord.gg/TrwWWMXdtR)
<img alt="GitHub Contributors" src="https://img.shields.io/github/contributors/shroominic/funcchain" />
<img alt="GitHub Last Commit" src="https://img.shields.io/github/last-commit/shroominic/funcchain" />
[![Pydantic v2](https://img.shields.io/endpoint?url=https://raw.githubusercontent.com/pydantic/pydantic/main/docs/badge/v2.json)](https://docs.pydantic.dev/latest/contributing/#badges)
[![Twitter Follow](https://img.shields.io/twitter/follow/shroominic?style=social)](https://x.com/shroominic)
<!-- <img alt="Repo Size" src="https://img.shields.io/github/repo-size/shroominic/funcchain" /> -->
```bash
pip install funcchain
```
## Introduction
`funcchain` is the *most pythonic* way of writing cognitive systems. Leveraging Pydantic models as output schemas, combined with LangChain in the backend, allows for seamless integration of LLMs into your apps.
It uses OpenAI function calling or LlamaCpp grammars (JSON-schema mode) for efficient structured output.
Under the hood, it compiles the funcchain syntax into LangChain runnables, so you can easily invoke, stream, or batch process your pipelines.
[![Open in GitHub Codespaces](https://github.com/codespaces/badge.svg)](https://codespaces.new/ricklamers/funcchain-demo)
## Simple Demo
```python
from funcchain import chain
from pydantic import BaseModel

# define your output shape
class Recipe(BaseModel):
    ingredients: list[str]
    instructions: list[str]
    duration: int

# write prompts utilising all native python features
def generate_recipe(topic: str) -> Recipe:
    """
    Generate a recipe for a given topic.
    """
    return chain()  # <- this is doing all the magic

# generate llm response
recipe = generate_recipe("christmas dinner")

# recipe is automatically converted into a pydantic model
print(recipe.ingredients)
```
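Under the hood, the LLM's JSON reply is validated against the annotated return type. A minimal sketch of that validation step using plain Pydantic (the raw JSON string is a made-up example of what a model might return, not actual funcchain output):

```python
from pydantic import BaseModel

class Recipe(BaseModel):
    ingredients: list[str]
    instructions: list[str]
    duration: int

# hypothetical raw JSON, as an LLM might return it
raw = '{"ingredients": ["turkey", "potatoes"], "instructions": ["roast the turkey"], "duration": 180}'

# funcchain performs an equivalent validation step for you
recipe = Recipe.model_validate_json(raw)
print(recipe.duration)  # 180
```

If the reply does not match the schema, validation fails, which is what enables funcchain's auto-retry parsing.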
## Complex Structured Output
```python
from pydantic import BaseModel, Field
from funcchain import chain

# define nested models
class Item(BaseModel):
    name: str = Field(description="Name of the item")
    description: str = Field(description="Description of the item")
    keywords: list[str] = Field(description="Keywords for the item")

class ShoppingList(BaseModel):
    items: list[Item]
    store: str = Field(description="The store to buy the items from")

class TodoList(BaseModel):
    todos: list[Item]
    urgency: int = Field(description="The urgency of all tasks (1-10)")

# support for union types
def extract_list(user_input: str) -> TodoList | ShoppingList:
    """
    The user input is either a shopping list or a todo list.
    """
    return chain()

# the model will choose the output type automatically
lst = extract_list(
    input("Enter your list: ")
)

# custom handler based on type
match lst:
    case ShoppingList(items=items, store=store):
        print("Here is your Shopping List:")
        for item in items:
            print(f"{item.name}: {item.description}")
        print(f"You need to go to: {store}")

    case TodoList(todos=todos, urgency=urgency):
        print("Here is your Todo List:")
        for item in todos:
            print(f"{item.name}: {item.description}")
        print(f"Urgency: {urgency}")
```
## Vision Models
```python
from pydantic import BaseModel, Field
from funcchain import Image, chain, settings

# set global llm using model identifiers (see MODELS.md)
settings.llm = "openai/gpt-4-vision-preview"

# everything defined is part of the prompt
class AnalysisResult(BaseModel):
    """The result of an image analysis."""

    theme: str = Field(description="The theme of the image")
    description: str = Field(description="A description of the image")
    objects: list[str] = Field(description="A list of objects found in the image")

# easy use of images as input with structured output
def analyse_image(image: Image) -> AnalysisResult:
    """
    Analyse the image and extract its
    theme, description and objects.
    """
    return chain()

result = analyse_image(Image.open("examples/assets/old_chinese_temple.jpg"))

print("Theme:", result.theme)
print("Description:", result.description)
for obj in result.objects:
    print("Found this object:", obj)
```
## Seamless Local Model Support
```python
from pydantic import BaseModel, Field
from funcchain import chain, settings

# run the model locally via ollama
settings.llm = "ollama/openchat"

class SentimentAnalysis(BaseModel):
    analysis: str
    sentiment: bool = Field(description="True for Happy, False for Sad")

def analyze(text: str) -> SentimentAnalysis:
    """
    Determines the sentiment of the text.
    """
    return chain()

# generates using the local model
result = analyze("I really like when my dog does a trick!")

# reliably structured output (even for local models!)
print(result.analysis)
```
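For local models, structured output comes from constraining generation to the model's JSON schema (via grammars). You can inspect the schema that would be enforced using plain Pydantic:

```python
from pydantic import BaseModel, Field

class SentimentAnalysis(BaseModel):
    analysis: str
    sentiment: bool = Field(description="True for Happy, False for Sad")

# the JSON schema used to constrain grammar-based generation
schema = SentimentAnalysis.model_json_schema()
print(schema["properties"]["sentiment"])
```

The field descriptions end up in this schema, which is why they are worth writing carefully: they are part of what steers the model.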
## Features
- 🐍 pythonic
- 🔀 easy swap between openai or local models
- 🔄 dynamic output types (pydantic models, or primitives)
- 👁️ vision llm support
- 🧠 langchain_core as backend
- 📝 jinja templating for prompts
- 🏗️ reliable structured output
- 🔁 auto retry parsing
- 🔧 langsmith support
- 🔄 sync, async, streaming, parallel, fallbacks
- 📦 gguf download from huggingface
- ✅ type hints for all functions and mypy support
- 🗣️ chat router component
- 🧩 composable with langchain LCEL
- 🛠️ easy error handling
- 🚦 enums and literal support
- 📐 custom parsing types
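
As a sketch of the enum/literal support, return types can also be `Literal` values; here the `chain()` call is stubbed out with `...` so the snippet runs without an LLM (in real funcchain usage the body would be `return chain()`):

```python
from typing import Literal, get_type_hints

def rate_text(text: str) -> Literal["positive", "negative", "neutral"]:
    """Rate the sentiment of the text."""
    ...  # in real funcchain code: return chain()

# funcchain reads the annotated return type to constrain the output
hints = get_type_hints(rate_text)
print(hints["return"])
```

Primitive return types like `bool` or `int` work the same way: the annotation tells funcchain what to parse the response into.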
## Documentation
[Check out the docs here](https://shroominic.github.io/funcchain/) 👈

We also highly recommend running the examples in the `./examples` folder.
## Contribution
You want to contribute? Thanks, that's great!
For more information, check out the [Contributing Guide](docs/contributing/dev-setup.md).
Please run the dev setup to get started:
```bash
git clone https://github.com/shroominic/funcchain.git && cd funcchain
./dev_setup.sh
```