| Field | Value |
| --- | --- |
| Name | LiteLLMJson |
| Version | 0.0.7 |
| home_page | https://github.com/HawkClaws/lite_llm_json |
| Summary | This library offers functionality to cleanly extract JSON from LLM responses and generate prompts for LLM that return JSON. It features a simple implementation while maintaining high versatility. |
| upload_time | 2024-04-18 07:24:10 |
| maintainer | None |
| docs_url | None |
| author | HawkClaws |
| requires_python | >=3.6 |
| license | MIT |
| keywords | None |
| requirements | No requirements were recorded. |
| Travis-CI | No Travis. |
| coveralls test coverage | No coveralls. |
# lite_llm_json
## Description
This library cleanly extracts JSON from LLM responses and generates prompts that instruct an LLM to return JSON. It features a simple yet versatile implementation.
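
The core idea behind such extraction — finding the first valid JSON value inside free-form model output, even when it is wrapped in prose or markdown fences — can be sketched in plain Python. This is an illustrative sketch only, not this library's actual implementation:

```python
import json

def extract_json(text: str):
    """Return the first JSON object/array found in free-form text, or None.

    Scans for a '{' or '[' and asks the decoder to parse from there,
    which skips any surrounding prose or markdown fences.
    """
    decoder = json.JSONDecoder()
    for i, ch in enumerate(text):
        if ch in "{[":
            try:
                obj, _ = decoder.raw_decode(text[i:])
                return obj
            except json.JSONDecodeError:
                continue
    return None

reply = 'Sure! Here is the data:\n```json\n{"name": "Ada", "age": 36}\n```'
print(extract_json(reply))  # {'name': 'Ada', 'age': 36}
```

Using `json.JSONDecoder.raw_decode` rather than `json.loads` is what makes this tolerant of trailing text after the JSON value.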
## Installation
`pip install LiteLLMJson`
## HowToUse
```python
# Note: the package's original example used the legacy pre-1.0
# `openai.Completion` API with the retired `text-davinci-003` model;
# this version is updated for the current OpenAI SDK (any chat model works).
from openai import OpenAI

from lite_llm_json import LiteLLMJson

# Create the OpenAI client (reads OPENAI_API_KEY from the environment if omitted)
client = OpenAI(api_key="YOUR_API_KEY")

# Define the JSON schema the model's answer must follow
json_schema = {
    "type": "object",
    "properties": {"name": {"type": "string"}, "age": {"type": "integer"}},
    "required": ["name", "age"],
}

# Instantiate the LiteLLMJson class with the schema
llm_json = LiteLLMJson(json_schema)

# Define the query prompt
query_prompt = """## Instructions:
Provide information about a person."""

# Wrap the query in a prompt that asks the model for schema-conforming JSON
generated_prompt = llm_json.generate_prompt(query_prompt)

# Use the OpenAI API to get a completion
response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": generated_prompt}],
)

# Get the output text
output_text = response.choices[0].message.content

# Parse the output text to obtain JSON data
json_data = llm_json.parse_response(output_text)

# Display the JSON data
print(json_data)
```
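
The schema passed to the constructor describes the expected shape, but it is still prudent to validate the parsed result before using it, since models can return well-formed JSON that violates the schema. A minimal stdlib sketch of such a check (the helper name `check_required` is hypothetical; a real project would use the `jsonschema` package):

```python
def check_required(data: dict, schema: dict) -> list:
    """Return a list of problems: missing required keys or wrong types.

    Minimal illustration covering only the types used in the example schema.
    """
    type_map = {"string": str, "integer": int, "object": dict}
    problems = []
    for key in schema.get("required", []):
        if key not in data:
            problems.append(f"missing: {key}")
    for key, spec in schema.get("properties", {}).items():
        expected = type_map.get(spec.get("type"))
        if key in data and expected and not isinstance(data[key], expected):
            problems.append(f"wrong type: {key}")
    return problems

schema = {
    "type": "object",
    "properties": {"name": {"type": "string"}, "age": {"type": "integer"}},
    "required": ["name", "age"],
}
print(check_required({"name": "Ada", "age": 36}, schema))  # []
print(check_required({"name": "Ada"}, schema))             # ['missing: age']
```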
Raw data

```json
{
    "_id": null,
    "home_page": "https://github.com/HawkClaws/lite_llm_json",
    "name": "LiteLLMJson",
    "maintainer": null,
    "docs_url": null,
    "requires_python": ">=3.6",
    "maintainer_email": null,
    "keywords": null,
    "author": "HawkClaws",
    "author_email": null,
    "download_url": "https://files.pythonhosted.org/packages/f5/a8/29ed2502bf53da4f445aebed34f54df1c8736604def96ca085f257ca6822/LiteLLMJson-0.0.7.tar.gz",
    "platform": null,
    "description": "# lite_llm_json\r\n\r\n## Description\r\n\r\nThe library has functionality to cleanly extract JSON from LLM responses and generate prompts for LLM that return JSON. It features a simple yet versatile implementation.\r\n\r\n## Installation\r\n\r\n`pip install LiteLLMJson`\r\n\r\n## HowToUse\r\n\r\n```python\r\nimport openai\r\nfrom lite_llm_json import LiteLLMJson\r\n\r\n# Set the OpenAI API key\r\napi_key = \"YOUR_API_KEY\"\r\nopenai.api_key = api_key\r\n\r\n# Define the JSON schema\r\njson_schema = {\r\n \"type\": \"object\",\r\n \"properties\": {\"name\": {\"type\": \"string\"}, \"age\": {\"type\": \"integer\"}},\r\n \"required\": [\"name\", \"age\"],\r\n}\r\n# Instantiate the LiteLLMJson class\r\nllm_json = LiteLLMJson(json_schema)\r\n\r\n# Define the query prompt\r\nquery_prompt = \"\"\"## Instructions:\r\nProvide information about a person.\"\"\"\r\n\r\n# Get the generated prompt\r\ngenerated_prompt = llm_json.generate_prompt(query_prompt)\r\n\r\n# Use the OpenAI API to get a completion\r\nresponse = openai.Completion.create(\r\n engine=\"text-davinci-003\",\r\n prompt=generated_prompt\r\n)\r\n\r\n# Get the output text\r\noutput_text = response[\"choices\"][0][\"text\"]\r\n\r\n# Parse the output text to obtain JSON data\r\njson_data = llm_json.parse_response(output_text)\r\n\r\n# Display the JSON data\r\nprint(json_data)\r\n\r\n\r\n```\r\n",
    "bugtrack_url": null,
    "license": "MIT",
    "summary": "This library offers functionality to cleanly extract JSON from LLM responses and generate prompts for LLM that return JSON. It features a simple implementation while maintaining high versatility.",
    "version": "0.0.7",
    "project_urls": {
        "Homepage": "https://github.com/HawkClaws/lite_llm_json",
        "Source Code": "https://github.com/HawkClaws/lite_llm_json"
    },
    "split_keywords": [],
    "urls": [
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "12e45d3aa8931d5aa687a3db18227835bcacb602a0fdc586de6b2b86289d0c15",
                "md5": "a4395c5519daa3a8e686a61ee7e38c51",
                "sha256": "77ce0120883df8b3ccbf49820fdb883c657922d76368361be4f7f785ee125b1c"
            },
            "downloads": -1,
            "filename": "LiteLLMJson-0.0.7-py3-none-any.whl",
            "has_sig": false,
            "md5_digest": "a4395c5519daa3a8e686a61ee7e38c51",
            "packagetype": "bdist_wheel",
            "python_version": "py3",
            "requires_python": ">=3.6",
            "size": 4221,
            "upload_time": "2024-04-18T07:24:08",
            "upload_time_iso_8601": "2024-04-18T07:24:08.834879Z",
            "url": "https://files.pythonhosted.org/packages/12/e4/5d3aa8931d5aa687a3db18227835bcacb602a0fdc586de6b2b86289d0c15/LiteLLMJson-0.0.7-py3-none-any.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "f5a829ed2502bf53da4f445aebed34f54df1c8736604def96ca085f257ca6822",
                "md5": "74847bcc102920673949bdb3766b1d92",
                "sha256": "c7dfb9e7020d1aa571343c83ccab4b8a438b59b21276b1b0499ef21593df94d8"
            },
            "downloads": -1,
            "filename": "LiteLLMJson-0.0.7.tar.gz",
            "has_sig": false,
            "md5_digest": "74847bcc102920673949bdb3766b1d92",
            "packagetype": "sdist",
            "python_version": "source",
            "requires_python": ">=3.6",
            "size": 3632,
            "upload_time": "2024-04-18T07:24:10",
            "upload_time_iso_8601": "2024-04-18T07:24:10.473809Z",
            "url": "https://files.pythonhosted.org/packages/f5/a8/29ed2502bf53da4f445aebed34f54df1c8736604def96ca085f257ca6822/LiteLLMJson-0.0.7.tar.gz",
            "yanked": false,
            "yanked_reason": null
        }
    ],
    "upload_time": "2024-04-18 07:24:10",
    "github": true,
    "gitlab": false,
    "bitbucket": false,
    "codeberg": false,
    "github_user": "HawkClaws",
    "github_project": "lite_llm_json",
    "travis_ci": false,
    "coveralls": false,
    "github_actions": false,
    "lcname": "litellmjson"
}
```