llmcall

Name: llmcall
Version: 0.1.0rc1
Summary: A lite abstraction layer for LLM calls
Home page: https://github.com/rihoneailabs/llmcall
Author: Ndamulelo Nemakhavhani
Upload time: 2025-01-02 14:33:08
Requires Python: <4.0,>=3.11
License: Apache-2.0
Keywords: llm, ai, litellm, structure-outputs, openai, pydantic
Requirements: aiohappyeyeballs, aiohttp, aiosignal, annotated-types, anyio, attrs, certifi, charset-normalizer, click, colorama, distro, environs, filelock, frozenlist, fsspec, h11, httpcore, httpx, huggingface-hub, idna, importlib-metadata, jinja2, jiter, jsonschema-specifications, jsonschema, litellm, markupsafe, marshmallow, multidict, openai, packaging, propcache, pydantic-core, pydantic-settings, pydantic, python-dotenv, pyyaml, referencing, regex, requests, rpds-py, sniffio, tenacity, tiktoken, tokenizers, tqdm, typing-extensions, urllib3, yarl, zipp
# LLMCall

A lite abstraction layer for LLM calls.

## Motivation

As AI becomes more prevalent in software development, there is a growing need for simple, intuitive APIs for quick text generation, decision making, and more. This is especially important now that we have structured outputs, which allow us to integrate AI seamlessly into our application flow.

`llmcall` provides a minimal, batteries-included interface for common LLM operations without unnecessary complexity.

## Installation

```bash
pip install llmcall
```

## Example Usage

### Generation

```python
from llmcall import generate, generate_decision
from pydantic import BaseModel

# i. Basic generation
response = generate("Write a story about a fictional holiday to the sun.")

# ii. Structured generation
class ResponseSchema(BaseModel):
    story: str
    tags: list[str]

response: ResponseSchema = generate(
    "Create a rare story about the history of civilisation.",
    output_schema=ResponseSchema,
)

# iii. Decision making
decision = generate_decision(
    "Which is bigger?",
    options=["apple", "berry", "pumpkin"]
)
```

### Extraction

```python
from llmcall import extract
from pydantic import BaseModel

class ResponseSchema(BaseModel):
    email_subject: str
    email_body: str
    email_topic: str
    email_sentiment: str

text = """To whom it may concern,

Request for Admission at Harvard University

I write to plead with the admission board to consider my application for the 2022/2023 academic year. I am a dedicated student with a passion for computer science and a strong desire to make a difference in the world. I believe that Harvard University is the perfect place for me to achieve my dreams and make a positive impact on society."""

response: ResponseSchema = extract(text=text, output_schema=ResponseSchema)
```

## Configuration

Set environment variables:
- `LLMCALL_API_KEY`: Your API key
- `LLMCALL_MODEL`: Model to use (default: `openai/gpt-4o-2024-08-06`)

> **Note**: We recommend using `OpenAI` as the model provider due to their robust support for structured outputs. You can use other providers by setting `LLMCALL_MODEL` or changing the [config](./llmcall/core.py) directly. Any model supported by `LiteLLM` can be used.
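For example, the environment variables can be exported in your shell before running your application. The key below is a placeholder, and the Anthropic model identifier is an illustrative example of a LiteLLM-style provider/model string, not a tested recommendation:

```shell
# Placeholder value — substitute your provider's real API key.
export LLMCALL_API_KEY="<your-api-key>"

# Optional: override the default model with any LiteLLM-supported
# identifier, e.g. an Anthropic model (illustrative example).
export LLMCALL_MODEL="anthropic/claude-3-5-sonnet-20240620"
```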

## Roadmap

- [x] Simple API for generating unstructured text
- [x] Structured output generation using `Pydantic`
- [x] Decision making
- [x] Custom model selection (via `LiteLLM` - See [documentation](https://docs.litellm.ai/docs/providers))
- [x] Structured text extraction
- [ ] Structured text extraction from PDF, Docx, etc.
- [ ] Structured text extraction from Images
- [ ] Structured text extraction from Websites

## Documentation

Please refer to our comprehensive [documentation](./docs/index.md) to learn more about this tool.
