llm-json-adapter 0.1.0

Author: Takaaki Mizuno
Requires Python: >=3.10,<4.0
Uploaded: 2024-02-25 06:02:47

# LLM JSON Adapter

## What is it?

When calling an LLM from your application, you often want the output as JSON. OpenAI's GPT API has a mechanism called Function Calling that can return JSON, but Google's Gemini does not seem to offer an equivalent.

So I created a wrapper library that lets you switch between LLMs and get results back as JSON. It can do the following:

- Define the result you want as a JSON Schema
- Switch between LLM providers (currently OpenAI's GPT and Google's Gemini)
- Retry a specified number of times if JSON retrieval fails

## How to use

Create an `LLMJsonAdapter` with the parameters below, then call `generate()` to get the result as JSON.

| Parameter | Description |
| --- | --- |
| provider_name | The name of the LLM provider to use. Currently, only "google" and "openai" are supported. |
| max_retry_count | The number of times to retry if the JSON retrieval fails. |
| attributes | The attributes to pass to the LLM provider. |

### Attributes

| Parameter | Description |
| --- | --- |
| api_key | The API key to use. |
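
For example, a minimal construction might look like this. This is only a sketch: reading the key from an environment variable (and the variable name `LLM_API_KEY`) is an assumption, not part of the library; you can also pass the key directly, as in the full example below.

```python
import os

from llm_json_adapter import LLMJsonAdapter

# Assumption: the API key is stored in an environment variable named LLM_API_KEY.
# Switching providers only requires changing provider_name.
adapter = LLMJsonAdapter(
    provider_name="openai",  # or "google"
    max_retry_count=3,
    attributes={"api_key": os.environ["LLM_API_KEY"]},
)
```

A complete example, describing the expected result with a JSON Schema, follows: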


```python
from llm_json_adapter import LLMJsonAdapter, Response

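# Create an adapter (Google's Gemini here), retrying up to 3 times if JSON retrieval fails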
adapter = LLMJsonAdapter(provider_name="google", max_retry_count=3, attributes={
    "api_key": "Your API Key"
})
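
# Ask the model for output matching the JSON Schema passed via `function`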
result = adapter.generate(
    prompt="prompt",
    language="en",
    act_as="Professional Software Service Business Analyst",
    function=Response(
        name="response name",
        description="response description",
        parameters={
            "type": "object",
            "properties": {
                "data": {
                    "type": "array",
                    "items": {
                        "type": "object",
                        "properties": {
                            "title": {
                                "type": "string",
                            },
                            "description": {
                                "type": "string",
                            },
                        },
                        "required": ["title", "description"],
                    },
                },
            },
            "required": ["data"]
        },
    )
)
```
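
The README does not document the return type of `generate()`. Assuming it returns the parsed JSON as a plain Python structure matching the schema above (an assumption, not documented behavior), you might consume it like this:

```python
# Sketch only: assumes `result` is a dict shaped like the schema defined above.
for item in result.get("data", []):
    print(f"{item['title']}: {item['description']}")
```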
