llm-loop


Name: llm-loop
Version: 0.2
Home page: https://github.com/chigwell/llm-loop
Summary: A utility package for querying language models with pattern matching and retry logic
Upload time: 2023-12-10 17:14:37
Author: Evgenii Evstafev
Requirements: No requirements were recorded.
# llm-loop

## Overview

`llm-loop` is a Python package that simplifies querying language models (such as GPT-style models) until a response matching a specified pattern is obtained or a maximum number of attempts is reached. This is particularly useful when a response in a specific format is required from an AI model.
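
Conceptually, the loop keeps re-prompting the model until a regular expression matches the reply or the attempt budget runs out. Below is a minimal, package-independent sketch of that idea (the `ask_model` callable is a hypothetical stand-in for any LLM call, not part of `llm-loop`):

```python
import re
from typing import Callable, Optional

def query_until_match(ask_model: Callable[[str], str], prompt: str,
                      pattern: str, max_attempts: int = 10) -> Optional[str]:
    """Re-query the model until its reply matches `pattern` or attempts run out."""
    compiled = re.compile(pattern)
    for _ in range(max_attempts):
        reply = ask_model(prompt)
        if compiled.search(reply):
            return reply  # a reply in the required format
    return None           # no matching reply within the attempt budget
```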

## Installation

```bash
pip install llm-loop
```

This installs `llm-loop` itself. Because the PyPI metadata for this release records no dependencies, `ctransformers` (used in the examples below) may need to be installed separately with `pip install ctransformers`.

## Usage

Here's a basic example of how to use `llm-loop`:

1. **Import the necessary modules:**
   ```python
   import os
   from ctransformers import AutoModelForCausalLM, AutoTokenizer
   from llm_loop.main import LLMLoop
   ```

2. **Initialize the model with custom parameters:**
   ```python
   model_name = "YourModelName"
   model_file = "YourModelFileName"
   start_dir = '/path/to/your/model'
   model_path = f"{start_dir}/{model_file}"

   llm = AutoModelForCausalLM.from_pretrained(model_name, model_file=model_path, model_type='YourModelType', gpu_layers=YourGPULayers)
   ```

3. **Create an instance of LLMLoop and query the model** (a concrete end-to-end sketch follows this list):
   ```python
   loop = LLMLoop(llm, 10)  # 10 is the maximum number of attempts

   prompt = "Your prompt here"
   pattern = r'Your regex pattern here'

   # query_llm re-prompts the model until a reply matches the pattern or attempts run out
   response = loop.query_llm(prompt=prompt, pattern=pattern)

   print("Response:", response)
   ```
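
Putting the steps together, here is a hedged end-to-end sketch. The model repository, weights file, and regex pattern are illustrative assumptions, not values shipped with `llm-loop`:

```python
from ctransformers import AutoModelForCausalLM
from llm_loop.main import LLMLoop

# Illustrative model choice; any ctransformers-compatible GGUF/GGML model should work.
llm = AutoModelForCausalLM.from_pretrained(
    "TheBloke/Llama-2-7B-Chat-GGUF",           # assumed Hugging Face repo id
    model_file="llama-2-7b-chat.Q4_K_M.gguf",  # assumed weights file in that repo
    model_type="llama",
    gpu_layers=0,  # CPU-only; raise this to offload layers to the GPU
)

loop = LLMLoop(llm, 5)  # retry at most 5 times

# Ask for a strictly numeric answer and retry until the reply actually contains one.
prompt = "Answer with a single integer only: how many days are in a week?"
pattern = r"\b\d+\b"

response = loop.query_llm(prompt=prompt, pattern=pattern)
print("Response:", response)
```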

## Contributing

Contributions to `llm-loop` are welcome! Please feel free to submit pull requests or open issues to suggest improvements or add new features.

## License

MIT.

            
