<img src="https://github.com/WouterDurnez/fence/blob/main/docs/logo.png?raw=true" alt="tests" height="200"/>
[![Python](https://img.shields.io/pypi/pyversions/fence-llm)](https://pypi.org/project/fence-llm/)
[![Test Status](https://github.com/WouterDurnez/fence/actions/workflows/ci-pipeline.yaml/badge.svg)](https://github.com/WouterDurnez/fence/actions)
[![codecov](https://codecov.io/gh/WouterDurnez/fence/branch/main/graph/badge.svg?token=QZQZQZQZQZ)](https://codecov.io/gh/WouterDurnez/fence)
[![PyPI version](https://badge.fury.io/py/fence-llm.svg)](https://badge.fury.io/py/fence-llm)
[![Documentation Status](https://readthedocs.org/projects/fence-llm/badge/?version=latest)](https://fence-llm.readthedocs.io/en/latest/?badge=latest)
[![Code style: black](https://img.shields.io/badge/code%20style-black-000000.svg)](https://github.com/psf/black)
[![pre-commit](https://img.shields.io/badge/pre--commit-enabled-brightgreen?logo=pre-commit&logoColor=white)](https://github.com/pre-commit/pre-commit)
[![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](https://opensource.org/licenses/MIT)
[![Contributor Covenant](https://img.shields.io/badge/Contributor%20Covenant-2.1-4baaaa.svg)](code_of_conduct.md)
# 🤺 Fence
`Fence` is a simple, lightweight library for LLM communication. A lot of the functionality was inspired by or derived from LangChain (the OG LLM package) basics, since that's how the package was born - as a stripped-down version of LangChain functionality, with cooler names.
## 🤔 Raison d'être
The simple answer: by accident. The slightly longer answer: LangChain used to be (is?) a pretty big package with a ton of dependencies. The upside is that it's powerful for PoC purposes, because it has it all.
The downsides:
- It's **_big_**. It takes up a lot of space (which can be an issue in some environments/runtimes), often for functionality that isn't needed.
- It's fairly **_complex_**. It's a big package with a lot of functionality, which can be overwhelming for new users.
- It **_hasn't always been dependable_** in industrial settings. Version jumps were common, and releases occasionally broke existing code.
As a result, many developers (particularly those working in large production environments) have advocated for more lightweight, custom functionality that favors stability and robustness.
### Circling back: why Fence?
Since our work was in a production environment, mostly dealing with Bedrock, we just started building some **basic components** from scratch. We needed a way to communicate with our models, which turned into the `Link` class (_wink wink_).
Then, some other things were added left and right, and this eventually turned into a miniature package. In no small part because it was fun to go down this road. But mostly because it strikes the right balance between convenience and flexibility.
Naturally, it's nowhere near as powerful as, for instance, LangChain. If you want to build a quick PoC with relatively complex logic, maybe go for the OG instead. If you want to be set on your way with a simple, lightweight package that's easy to understand and extend, Fence might be the way to go.
## 🛠️ How do I use it?
Fence just has a few basic components. See the [notebooks](notebooks) for examples on how to use them. Documentation is coming soon, but for now, you can check out the [source code](fence) for more details.
## 📦 Installation
You can install Fence from PyPI:
```bash
pip install fence-llm
```
## 👋 Look ma, no dependencies (kinda)!
Here's a hello world example:
```python
from fence import Link
from fence.templates.string import StringTemplate
from fence.models.openai import GPT4omini
# Create a link
link = Link(
    model=GPT4omini(),
    template=StringTemplate("Write a poem about the value of a {topic}!"),
    name='hello_world_link'
)
# Run the link
output = link.run(topic='fence')['state']
print(output)
```
This will output something like:
```bash
[2024-10-04 17:45:15] [ℹ️ INFO] [links.run:203] Executing <hello_world_link> Link
Sturdy wood and nails,
Boundaries draw peace and calm,
Guarding hearts within.
```
Much wow, very next level. There's more in the [notebook](notebooks) section, with a _lot_ more to cover!
## 💪 Features
### What can I do with Fence?
- **Uniform interface for `LLMs`**. Since our main use case was Bedrock, we built Fence to work with Bedrock models. However, it also has OpenAI support, and it's easy to extend to other models (contributors welcome!).
- **Links and Chains** help you build complex pipelines with multiple models. This is a feature that's been around since LangChain, and it's still here. You can parametrize templates, and pass the output of one model to another.
- **Template classes** that handle the basics, and that work across models (e.g., a `MessageTemplate` can be sent to a Bedrock Claude3 model, _or_ to an OpenAI model - system/user/assistant formatting is handled under the hood).
- **Agents** to move on to the sweet, sweet next level of LLM orchestration. Built using the ReAct pattern.
- **Basic utils on board** for typical tasks like retries, parallelization, logging, output parsers, etc.
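To make the link-and-chain idea concrete, here is a minimal conceptual sketch in plain Python. This is _not_ Fence's actual implementation - the `MiniLink`/`MiniChain` classes and the callable-as-model convention are hypothetical stand-ins used purely to illustrate how a templated prompt flows from one step to the next:

```python
# Conceptual sketch of links and chains - NOT Fence's real classes.
# A "model" here is just any callable mapping a prompt to a completion.

class MiniLink:
    """Pairs a prompt template with a model callable."""

    def __init__(self, model, template: str, name: str = "link"):
        self.model = model        # callable: prompt -> completion
        self.template = template  # e.g. "Summarize: {text}"
        self.name = name

    def run(self, **kwargs) -> dict:
        # Fill the template's placeholders, call the model,
        # and expose the result under a 'state' key.
        prompt = self.template.format(**kwargs)
        return {"state": self.model(prompt)}


class MiniChain:
    """Feeds each link's output into the next link as 'state'."""

    def __init__(self, links):
        self.links = links

    def run(self, **kwargs) -> dict:
        state = kwargs
        for link in self.links:
            state = link.run(**state)
        return state


# A fake "model" so the sketch runs without API keys or network access
echo_model = lambda prompt: f"[model saw: {prompt}]"

chain = MiniChain([
    MiniLink(echo_model, "Draft a haiku about {topic}", name="draft"),
    MiniLink(echo_model, "Polish this draft: {state}", name="polish"),
])
print(chain.run(topic="fences")["state"])
```

The real `Link` class naturally does more (logging, parsers, retries), but the core contract - templates in, `state` out, chained end to end - is the same shape as the hello-world example above.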
### What can't I do with Fence?
It's obviously not as powerful as some of the other packages out there, which pack many more features. We're also not trying to fall into the trap of building 'yet another framework' (insert [XKCD](https://xkcd.com/927/) here), so we're trying to guard our scope. If you need a lot of bells and whistles, you might want to look at any of these:
- [`LangChain`](https://www.langchain.com/)
The OG, no explanation needed.
- [`Griptape`](https://www.griptape.ai)
A more recent package, with a lot of cool features! Great for building PoCs, too. Built by ex-AWS folks, and promises to be a lot more industry-oriented.
## 🗺️ Roadmap
- [ ] Add more models (e.g., native Anthropic models)
- [ ] Add more tests 😬
- [ ] Add more notebook tutorials to showcase features
## 🤝 Contributing
We welcome contributions! Check out the [CONTRIBUTING.md](CONTRIBUTING.md) for more details.