# chaincrafter
Seamless integration and composability for large language model apps.
## Features
* Composable prompts and chains
  * Use multiple models to run one chain and then use that as input for a different chain and model
* Customizable prompt and response formatting
  * Add modifiers to prompts to change the style, length, and format of the response
  * Extract data from the response to use in the next prompt
  * Add custom functions to process the response
  * Add custom functions to process the input variables
* Integration with OpenAI API (llama.cpp support in progress)
* Async calls to models
* Load Prompts and Chains from YAML using Catalogs
  * Makes it easier to share prompts and chains between projects
  * Build up a prompt library
## Installation
```bash
pip install chaincrafter
```
## Usage
1. Define your prompts and the variables that they expect
   - The input variables can be of any type, and can be processed by a function
   - The prompt message is treated as an _f-string_
2. Define your chain of prompts
   - The chain is a list of tuples, where each tuple contains a prompt and the output key to store the response in
   - The output key is used to access the response in the next prompt
3. Set up the models that you want to use
4. Run the chain using the models
```python
from chaincrafter import Chain, Prompt
from chaincrafter.models import OpenAiChat

chat_model = OpenAiChat(temperature=0.65, model_name="gpt-3.5-turbo")
system_prompt = Prompt("You are a helpful assistant who responds to questions about the world")
hello_prompt = Prompt("Hello, what is the capital of France? Answer only with the city name.")
followup_prompt = Prompt("{city} sounds like a nice place to visit. What is the population of {city}?")

chain = Chain(
    system_prompt,
    (hello_prompt, "city"),
    (followup_prompt, "followup_response"),
)
messages = chain.run(chat_model)
for message in messages:
    print(f"{message['role']}: {message['content']}")
```
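The `{city}` substitution above works like Python f-string formatting: each prompt's response is stored under its output key, and later prompt templates are filled in from that mapping. A minimal sketch of the mechanism in plain Python (for illustration only — `run_chain` and the stub model below are not part of the chaincrafter API):

```python
# Sketch of the chaining mechanism: each response is stored under its
# output key, and later prompt templates are filled from that mapping.
# Plain Python for illustration; not the chaincrafter API.

def run_chain(steps, model):
    """steps: list of (template, output_key); model: callable str -> str."""
    context = {}
    messages = []
    for template, output_key in steps:
        prompt = template.format(**context)  # f-string-style substitution
        response = model(prompt)
        context[output_key] = response       # available to later prompts
        messages.append({"role": "user", "content": prompt})
        messages.append({"role": "assistant", "content": response})
    return messages

# Stub model that always answers "Paris", to show the data flow only
messages = run_chain(
    [
        ("What is the capital of France?", "city"),
        ("What is the population of {city}?", "followup"),
    ],
    lambda prompt: "Paris",
)
```

With the stub model, the second prompt becomes `"What is the population of Paris?"` — the `city` key captured from the first response fills the `{city}` placeholder.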
### Running the examples
```bash
source venv/bin/activate
export OPENAI_API_KEY="..."
python -m examples.interesting_facts
python -m examples.interesting_facts_catalog
```