bulk-chain

Name: bulk-chain
Version: 0.24.1
Home page: https://github.com/nicolay-r/bulk-chain
Summary: A lightweight, no-strings-attached Chain-of-Thought framework for your LLM, ensuring reliable results for bulk input requests.
Upload time: 2024-11-02 12:36:59
Author: Nicolay Rusnachenko
Requires Python: >=3.6
License: MIT License
Keywords: natural language processing, chain-of-thought, reasoning
            # bulk-chain 0.24.1
![](https://img.shields.io/badge/Python-3.9-brightgreen.svg)
[![](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/nicolay-r/bulk-chain/blob/master/bulk_chain_tutorial.ipynb)
[![twitter](https://img.shields.io/twitter/url/https/shields.io.svg?style=social)](https://x.com/nicolayr_/status/1847969224636961033)

<p align="center">
    <img src="logo.png"/>
</p>

A lightweight, no-strings-attached **[Chain-of-Thought](https://arxiv.org/abs/2201.11903) framework** for your LLM, ensuring reliable results for bulk input requests stored in `CSV` / `JSONL` / `sqlite`.
It applies a series of prompts organized into a `schema` (see the [related section](#chain-of-thought-schema)).

### Features
* ✅ **No-strings**: no hard-wired LLM dependencies, leaving you free to customize your `venv`.
* ✅ **Iterator over an unlimited number of input contexts** served in `CSV`/`JSONL`.
* ✅ **Progress caching**: withstands exceptions during LLM calls by caching answers with the `sqlite3` engine.
* ✅ **Schema descriptions** supporting the Chain-of-Thought concept.

# Installation

```bash
pip install bulk-chain
```

## Chain-of-Thought Schema

To declare a Chain-of-Thought (CoT) schema, this project uses the `JSON` format.
The format includes a `name` field for the schema name and a `schema` field listing the CoT instructions for the Large Language Model.

Each step is a dictionary with `prompt` and `out` keys, corresponding to the input prompt and the output variable name respectively.
All variable names are expected to be wrapped in `{}`.

Below is an example of how to declare your own schema:

```json
{
  "name": "schema-name",
  "schema": [
    {"prompt": "Given the question '{text}', let's think step-by-step.",
     "out": "steps"},
    {"prompt": "For the question '{text}' the reasoning steps are '{steps}'. What would be an answer?",
     "out": "answer"}
  ]
}
```

Other templates are available [here](/ext/schema/thor_cot_schema.json).
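Conceptually, a schema is executed step by step: each prompt is filled in with the variables known so far (the input record's fields plus earlier `out` values), sent to the LLM, and the answer is stored under the step's `out` name. The following is a minimal sketch of that loop with a dummy model, not the library's actual engine (which additionally handles bulk iteration and `sqlite3` caching):

```python
# Illustrative sketch of applying a CoT schema to a single input record.
# `ask` is any callable that maps a prompt string to a model answer.

def run_schema(schema, record, ask):
    """Fill each prompt with known variables, query the model,
    and store the answer under the step's 'out' name."""
    context = dict(record)
    for step in schema["schema"]:
        prompt = step["prompt"].format(**context)
        context[step["out"]] = ask(prompt)
    return context

schema = {
    "name": "schema-name",
    "schema": [
        {"prompt": "Given the question '{text}', let's think step-by-step.",
         "out": "steps"},
        {"prompt": "For the question '{text}' the reasoning steps are '{steps}'. What would be an answer?",
         "out": "answer"},
    ],
}

# A dummy "LLM" that wraps the prompt, just to show the data flow.
result = run_schema(schema, {"text": "2+2?"}, ask=lambda p: f"<reply to: {p}>")
print(result["steps"])
print(result["answer"])
```

Note how the second step consumes `{steps}`, the output of the first step; this chaining is what the schema format is for.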

# Usage

Just **three** simple steps:

1. Define your [CoT Schema](#chain-of-thought-schema), or fetch it as shown below:
```bash
!wget https://raw.githubusercontent.com/nicolay-r/bulk-chain/refs/heads/master/ext/schema/default.json
```
2. Fetch or write your own **model**, or pick one of the [presets here](/ext/):
```bash
!wget https://raw.githubusercontent.com/nicolay-r/bulk-chain/refs/heads/master/ext/flan_t5.py
```

3. Launch inference (in chat mode):
```bash
!python -m bulk_chain.infer \
    --schema "default.json" \
    --adapter "dynamic:flan_t5.py:FlanT5" \
    %% \
    --device "cpu" \
    --temp 0.1
```

# Embed your LLM

All you have to do is implement the `BaseLM` class, which includes:
* `__init__` -- for initialization;
* `ask(prompt)` -- to infer your model with the given `prompt`.

See examples with models [here](/ext).
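For instance, a trivial adapter that just echoes its prompt might look like the sketch below. The `BaseLM` import path is an assumption (check the [/ext](/ext) examples for the real one); the fallback stub only exists so the sketch runs standalone:

```python
# Sketch of the interface bulk-chain expects from a custom model adapter.
# ASSUMPTION: the base-class import path below may differ in your installed
# version of bulk-chain -- consult the /ext examples for the actual path.
try:
    from bulk_chain.core.llm_base import BaseLM  # assumed location
except ImportError:
    class BaseLM:  # stand-in so this sketch runs without bulk-chain installed
        pass

class EchoLM(BaseLM):
    """A model that returns the prompt unchanged -- useful for dry-running
    a schema before plugging in a real LLM."""

    def __init__(self, prefix="ECHO"):
        self.prefix = prefix

    def ask(self, prompt):
        # A real adapter would tokenize `prompt`, run inference,
        # and decode the generated text here.
        return f"{self.prefix}: {prompt}"

model = EchoLM()
print(model.ask("Hello"))
```

A real adapter (e.g. the `FlanT5` preset referenced above) follows the same shape: load weights in `__init__`, generate text in `ask`.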

            
