llm_wrapper

Name: llm_wrapper
Version: 0.1.8
Home page: https://github.com/meirm/llm_functions
Summary: Run llm functions with just inline documentation and no code
Upload time: 2024-04-30 08:14:52
Author: Meir Michanie
Requires Python: >=3.11
License: None
Keywords: openai, langchain, pydantic
# LLM Functions Library Usage Guide

This guide provides a comprehensive overview of how to use the `llm_wrapper` library, which offers a versatile wrapper, `llm_func`, designed for seamless interaction with various large language models (LLMs). The wrapper simplifies model initialization, query execution, and structured output parsing, supporting a wide range of return types, including basic data types (`int`, `float`, `str`, `bool`, `set`, `list`, `dict`) and Pydantic `BaseModel` structures.

## Getting Started
```python
from llm_wrapper import llm_func
from langchain_openai import OpenAI

@llm_func
def famous_quote() -> str:
    """Returns a famous quote according to the subject provided."""
    pass

llm = OpenAI()

query = "Peace and War"
quote = famous_quote(llm=llm, query=query)
print(quote)  # Output: "Peace is not a relationship of nations. It is a condition of mind brought about by a serenity of soul. Peace is not merely the absence of war. It is also a state of mind. Lasting peace can come only to peaceful people." - Jawaharlal Nehru

@llm_func
def check_grammar() -> float:
    """Check the grammar of the sentence and return a float number between 0 and 1 reflecting its correctness."""
    pass

query = "I are a student."
correctness = check_grammar(llm=llm, query=query)
print(correctness)  # Output: 0.5
query = "I am a student."
correctness = check_grammar(llm=llm, query=query)
print(correctness)  # Output: 1.0
```
### Installation

Ensure the `llm_wrapper` library is installed in your environment. You can install it using pip:

```bash
pip install llm_wrapper
```
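
The Getting Started example above also imports `langchain_openai`, which ships as a separate package; if you follow that example with the OpenAI integration, you will likely need it as well (installation sketch, assuming you use that backend):

```bash
pip install langchain-openai
```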

### Importing the Library

Start by importing the necessary components:

```python
from llm_wrapper import llm_func
from pydantic import BaseModel
```

### Initializing Your LLM Object

You'll need to instantiate your preferred LLM object; the library is designed to work flexibly with various LLM backends (the Getting Started example uses `OpenAI` from `langchain_openai`):

```python
llm = YourPreferredLLM()
```
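
For example, following the Getting Started snippet, you could instantiate a LangChain OpenAI model as your LLM object (a minimal sketch, assuming the `langchain-openai` package is installed and an `OPENAI_API_KEY` is available in your environment):

```python
from langchain_openai import OpenAI

# The resulting instance is passed to every @llm_func-decorated function via llm=llm
llm = OpenAI()
```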

## Using the `llm_func` Wrapper

The `llm_func` wrapper is designed to streamline your interaction with LLMs. It automatically handles functions returning basic types (`int`, `float`, `str`, `bool`, `set`, `list`, `dict`) or Pydantic `BaseModel` instances.

### Defining Functions with `@llm_func`

Annotate your functions with `@llm_func` and define a clear return type. Here's how to define functions returning basic types and Pydantic models:

#### Basic Types

```python
@llm_func
def calculate_score() -> int:
    """Returns an integer score based on the input text."""
    pass

@llm_func
def is_valid() -> bool:
    """Determines if the text meets certain criteria, returning True or False."""
    pass
```

#### Pydantic BaseModel

```python
class User(BaseModel):
    name: str
    age: int

@llm_func
def get_user_details() -> User:
    """Extracts user details from the text and returns them as a User model."""
    pass
```

### Executing Queries

Pass your query to the function, along with the instantiated LLM object. The wrapper will process the input and return a structured output based on the defined return type.

```python
query = "Calculate the score for the following text..."
score = calculate_score(llm=llm, query=query)
print(score)  # Output will be of type int

query = "Check if the following text is valid..."
validity = is_valid(llm=llm, query=query)
print(validity)  # Output will be of type bool

query = "Extract user details from the following text..."
user_details = get_user_details(llm=llm, query=query)
print(user_details)  # Output will be a User instance
```
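
Because `get_user_details` is annotated to return a Pydantic model, the result's fields can be read as ordinary attributes (a brief sketch; the actual values depend on the query and the model's response):

```python
# user_details is a User instance with typed fields
print(user_details.name)  # e.g. "Alice" (illustrative value)
print(user_details.age)   # e.g. 30 (illustrative value)
```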

### Support and Development

Currently, `llm_func` supports functions returning basic data types (`int`, `float`, `str`, `bool`, `set`, `list`, `dict`) and Pydantic `BaseModel` instances. Support for additional types is under active development, and updates will be released periodically to enhance the library's functionality.
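
For the collection types listed above, the pattern is identical; here is a minimal sketch using a `list` return annotation (the function name and exact output are illustrative and depend on the underlying model):

```python
@llm_func
def list_keywords() -> list:
    """Extracts the main keywords from the provided text and returns them as a list."""
    pass

query = "Extract keywords from the following text..."
keywords = list_keywords(llm=llm, query=query)
print(keywords)  # Output will be of type list, e.g. ["keyword1", "keyword2"]
```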

By following these guidelines, you can efficiently use the `llm_wrapper` library to interact with language models, perform queries, and handle structured outputs, all while writing clear and maintainable code.


### Documentation
[llm_wrapper Tutorial](https://github.com/meirm/llm_wrapper/blob/main/docs/tutorial.md)

            
