Name | lang-interface |
Version | 0.0.3 |
Summary | Integrate a Python programming interface with an LLM Assistant |
home_page | None |
upload_time | 2024-07-04 21:10:01 |
maintainer | None |
docs_url | None |
author | None |
requires_python | >3.11.0 |
license | None |
keywords | llm, assistant |
requirements | No requirements were recorded. |
Travis-CI | No Travis. |
coveralls test coverage | No coveralls. |
# 🔗 🐍 Lang Interface
A super lightweight helper that turns your Python interface
into an AI assistant.
## 🚀 Quick start
```python
from os import environ

from openai import OpenAI

from lang_interface import Assistant

environ['OPENAI_API_KEY'] = '<api_key>'


class MyAPI:
    """API for managing the user's list of contacts (mobile numbers)."""
    contacts = {'John': '000-000', 'Bill': '111-111'}

    def do_get_contact_list(self, name_starts_with: str = '') -> dict[str, str]:
        """Get contact names and phone numbers."""
        return {
            name: phone
            for name, phone in self.contacts.items()
            if name.startswith(name_starts_with)
        }

    def do_add_contact(self, name: str, phone: str) -> None:
        """Add a new contact."""
        if name in self.contacts:
            raise ValueError(f'Contact with name {name} already exists!')
        self.contacts[name] = phone


llm = OpenAI()
api = MyAPI()
assistant = Assistant(api, llm)
print(assistant('Do I have Bob in my contacts?'))
```
### Example interactive mode 💬
```python
def example_chat():
    while True:
        try:
            q = input('\033[1;36m> ')
            print('\033[0m', end='')
            answer = assistant(q)
            print(f'\033[0massistant: {answer}')
        except KeyboardInterrupt:
            print('\033[0;32mBye!')
            break


example_chat()
```
## 📝 Basics
Lang Interface uses Python **docstrings** and **type hints** to create a short specification
of the programming API for the LLM.
The quality of the output depends on a well-structured class whose docstrings are concise and unambiguous.
It is recommended to use Python type hints to describe parameters and return values.
If you need to specify a complicated input/output format, use Pydantic models:
```python
from datetime import datetime

from pydantic import BaseModel


class MyContact(BaseModel):
    id: int
    name: str
    phone_number: str
    created_at: datetime


class Interface:
    def do_create_contact(self, contact: MyContact):
        ...
```
Plain dictionaries can also work reliably, but remember to write a comprehensible docstring that describes the expected structure.
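For instance, a dict-based method might look like this (a hypothetical sketch; `ContactAPI` and its fields are illustrative, not part of lang_interface):

```python
# Hypothetical sketch: a dict-based method whose docstring spells out
# the expected keys, since no type model constrains the input.
class ContactAPI:
    def __init__(self) -> None:
        self.contacts: dict[str, str] = {}

    def do_create_contact(self, contact: dict) -> str:
        """Create a contact from a dict with keys 'name' (str) and
        'phone' (str). Returns a confirmation message."""
        name, phone = contact['name'], contact['phone']
        self.contacts[name] = phone
        return f'Created contact {name}'
```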
### LLM
**lang_interface** accepts an OpenAI client or any callable object with this signature:
```python
def call_llm(messages: list[dict]) -> str: ...
```
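Any such callable works, which makes it easy to plug in a stub for testing without real API calls (a sketch; the role/content message shape follows the OpenAI chat convention):

```python
# A minimal stand-in satisfying the callable contract: it takes the chat
# history as a list of {'role': ..., 'content': ...} dicts and returns a string.
def fake_llm(messages: list[dict]) -> str:
    last = messages[-1]['content']
    return f'echo: {last}'
```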
## 🔒 Security concerns
When putting an API in the hands of an LLM, make sure you have safety checks in place to prevent
malicious actions from LLM-generated instructions.
Take a look at this example:
```python
import os
assistant = Assistant(os, openai_client)
```
In this example the whole `os` module is handed to the LLM as an API, potentially making it possible to run
the equivalent of `rm -rf /` even if the LLM was never asked to do so.
Whenever you provide an API, make sure the LLM cannot harm your data or system in any way.
## 💻️ Advanced
### Classes vs Modules
**lang_interface** supports both a Python module and a class as an API handler.
For example:
```python
"""My module for managing ..."""

def do_this(): ...
def do_that(): ...
```
Or:
```python
class MyAPI:
    """My API class for managing ..."""
    def do_this(self): ...
    def do_that(self): ...
```
### Prefixes
By default, **lang_interface** scans all public methods/functions.
If you need to expose only a specific set of methods, use *methods_prefix*:
```python
Assistant(api, llm, methods_prefix='do_')
```
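Conceptually, the prefix filter amounts to exposing only callables whose names start with the given prefix (a sketch of the idea, not the library's actual scan):

```python
class MyAPI:
    def do_get(self): ...
    def helper(self): ...

# Only names matching the prefix would be exposed to the LLM;
# 'helper' and dunder methods are filtered out.
prefix = 'do_'
exposed = [n for n in dir(MyAPI)
           if not n.startswith('_') and n.startswith(prefix)]
```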
### Debug
Set `DEBUG = True` to print all interactions with the LLM:
```python
import lang_interface
lang_interface.DEBUG = True
```
### Callbacks
You might want a callback on every LLM request/response.
You can do that by providing a custom callable object as the LLM:
```python
import logging

from openai import OpenAI

logger = logging.getLogger(__name__)
openai_client = OpenAI()


class MyLLM:
    def __call__(self, messages: list[dict]) -> str:
        resp = openai_client.chat.completions.create(
            messages=messages, model='gpt-4o'
        )
        text = resp.choices[0].message.content
        logger.info(f'Response from LLM: {text}')
        return text
```
Raw data
{
"_id": null,
"home_page": null,
"name": "lang-interface",
"maintainer": null,
"docs_url": null,
"requires_python": ">3.11.0",
"maintainer_email": null,
"keywords": "llm, assistant",
"author": null,
"author_email": null,
"download_url": "https://files.pythonhosted.org/packages/60/72/f12595baf2b930a7f65aac921ec8c75a3d569491541d42b339d0384e6edc/lang_interface-0.0.3.tar.gz",
"platform": null,
"bugtrack_url": null,
"license": null,
"summary": "Integrate a Python programming interface with an LLM Assistant",
"version": "0.0.3",
"project_urls": null,
"split_keywords": [
"llm",
" assistant"
],
"urls": [
{
"comment_text": "",
"digests": {
"blake2b_256": "669e83ad2b67aa291d04ad93f85c223cc5df790e9e4aac86633bde45c3fe1860",
"md5": "048b1ade08a931e47f524f1b3d64749b",
"sha256": "e2d6ba19a23be9ad4b9623e8cc548717b4828814a30419b30a5bee58750f3717"
},
"downloads": -1,
"filename": "lang_interface-0.0.3-py3-none-any.whl",
"has_sig": false,
"md5_digest": "048b1ade08a931e47f524f1b3d64749b",
"packagetype": "bdist_wheel",
"python_version": "py3",
"requires_python": ">3.11.0",
"size": 10499,
"upload_time": "2024-07-04T21:09:46",
"upload_time_iso_8601": "2024-07-04T21:09:46.735167Z",
"url": "https://files.pythonhosted.org/packages/66/9e/83ad2b67aa291d04ad93f85c223cc5df790e9e4aac86633bde45c3fe1860/lang_interface-0.0.3-py3-none-any.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": "",
"digests": {
"blake2b_256": "6072f12595baf2b930a7f65aac921ec8c75a3d569491541d42b339d0384e6edc",
"md5": "92beeeb55e5b5cbc293a925df19fb9ca",
"sha256": "aa6caba7b5890ab3ea1ecfca88cbb35786810d2235803e35852db591be6e9080"
},
"downloads": -1,
"filename": "lang_interface-0.0.3.tar.gz",
"has_sig": false,
"md5_digest": "92beeeb55e5b5cbc293a925df19fb9ca",
"packagetype": "sdist",
"python_version": "source",
"requires_python": ">3.11.0",
"size": 9900,
"upload_time": "2024-07-04T21:10:01",
"upload_time_iso_8601": "2024-07-04T21:10:01.752468Z",
"url": "https://files.pythonhosted.org/packages/60/72/f12595baf2b930a7f65aac921ec8c75a3d569491541d42b339d0384e6edc/lang_interface-0.0.3.tar.gz",
"yanked": false,
"yanked_reason": null
}
],
"upload_time": "2024-07-04 21:10:01",
"github": false,
"gitlab": false,
"bitbucket": false,
"codeberg": false,
"lcname": "lang-interface"
}