usellm

Name: usellm
Version: 0.0.7
Summary: Use Large Language Models in Python Apps
Home page: https://github.com/usellm/usellm-py
Author: Siddhant
Upload time: 2023-06-15 13:44:23
Requirements: none recorded
# useLLM - Use Large Language Models in Python Apps

The `usellm` Python library enables interaction with a chat-based Large Language Model (LLM) service. It can be used for a variety of language-related tasks, such as generating chat conversations via the OpenAI API. It is designed as a Python port of the [`usellm`](https://usellm.org) JavaScript library.

## Installation

The library can be installed with `pip`:

```shell
pip install usellm
```

## Example Usage

Here is a basic usage example:

```python
from usellm import Message, Options, UseLLM

# Initialize the service
service = UseLLM(service_url="https://usellm.org/api/llm")

# Prepare the conversation
messages = [
  Message(role="system", content="You are a helpful assistant."),
  Message(role="user", content="What can you do for me?"),
]
options = Options(messages=messages)

# Interact with the service
response = service.chat(options)

# Print the assistant's response
print(response.content)
```

The above code generates a response using the OpenAI ChatGPT API. The service URL `https://usellm.org/api/llm` should be used for testing only.

## Classes and Methods

### 1. `UseLLM` class

The `UseLLM` class provides the interface for interacting with the LLM service.

Methods:
- `__init__(self, service_url: str)`: Initializes a new instance of the `UseLLM` class.
- `chat(self, options: Options) -> Message`: Interacts with the LLM using the provided `Options`, and returns a `Message` instance that represents the LLM's response.

### 2. `Options` class

The `Options` class represents a set of configuration options for a chat interaction with the LLM.

- `messages`: A list of `Message` instances representing the conversation up to the current point.
- `stream`: A boolean indicating if the interaction is a streaming interaction. Note: streaming is currently not supported.
- `template`: A string representing a message template to guide the conversation.
- `inputs`: A dictionary of additional inputs for the conversation.

Methods:
- `__init__(self, messages: Optional[List[Message]] = [], stream: Optional[bool] = None, template: Optional[str] = None, inputs: Optional[dict] = None)`: Initializes a new instance of the `Options` class.
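
For illustration, here is a minimal stand-alone sketch of a class with the documented `Options` interface, including the documented rejection of `stream=True` (this is not the library's actual source; it only mirrors the API described above):

```python
from typing import List, Optional


class Options:
    """Minimal sketch of the documented Options interface (illustrative only)."""

    def __init__(
        self,
        messages: Optional[List] = None,
        stream: Optional[bool] = None,
        template: Optional[str] = None,
        inputs: Optional[dict] = None,
    ):
        # Streaming is documented as unsupported, so reject it up front.
        if stream:
            raise Exception("Streaming is not currently supported")
        self.messages = messages or []
        self.stream = stream
        self.template = template
        self.inputs = inputs


# Template and inputs are passed alongside (or instead of) explicit messages:
opts = Options(template="Translate to French: {text}", inputs={"text": "Hello"})
```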

### 3. `Message` class

The `Message` class represents a message in a conversation. It consists of two main attributes:

- `role`: The role of the message sender. Common values are `system`, `user`, and `assistant`.
- `content`: The content of the message.

Methods:
- `__init__(self, role: str, content: str)`: Initializes a new instance of the `Message` class.
- `__repr__(self) -> str`: Returns a string representation of the `Message` instance.
- `__str__(self) -> str`: Returns a string representation of the `Message` instance.
- `to_dict(self) -> dict`: Returns a dictionary representation of the `Message` instance.
- `to_json(self) -> str`: Returns a JSON string representation of the `Message` instance.
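
A minimal stand-alone sketch of how a class with this interface might behave (again illustrative, not the library's actual source; names follow the documented API):

```python
import json


class Message:
    """Minimal sketch of the documented Message interface (illustrative only)."""

    def __init__(self, role: str, content: str):
        self.role = role
        self.content = content

    def __repr__(self) -> str:
        return f"Message(role={self.role!r}, content={self.content!r})"

    __str__ = __repr__

    def to_dict(self) -> dict:
        # Plain dict form, suitable for building a request payload.
        return {"role": self.role, "content": self.content}

    def to_json(self) -> str:
        # JSON string form of the same payload.
        return json.dumps(self.to_dict())


msg = Message(role="user", content="Hello!")
print(msg.to_dict())  # {'role': 'user', 'content': 'Hello!'}
print(msg.to_json())  # {"role": "user", "content": "Hello!"}
```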


## Exceptions

The library raises an `Exception` in the following situations:

- If the `stream` option is set to `True`, because streaming is not currently supported.
- If the HTTP response status code from the LLM service is not 200.
- If the HTTP response from the LLM service contains an "error" field.
- If the HTTP response from the LLM service does not contain a "choices" field.
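
The last three checks above might be sketched as follows (a hypothetical helper written for illustration, not the library's actual code; the `status_code`/`body` shape is an assumption):

```python
def validate_response(status_code: int, body: dict) -> list:
    """Sketch of the documented checks on an LLM service response.

    Returns the "choices" list on success, raises Exception otherwise.
    """
    if status_code != 200:
        raise Exception(f"HTTP error: status code {status_code}")
    if "error" in body:
        raise Exception(f"Service error: {body['error']}")
    if "choices" not in body:
        raise Exception("Malformed response: no 'choices' field")
    return body["choices"]


# A well-formed response passes through:
choices = validate_response(
    200, {"choices": [{"message": {"role": "assistant", "content": "Hi!"}}]}
)

# An error payload raises even when the status code is 200:
try:
    validate_response(200, {"error": "rate limit exceeded"})
except Exception as e:
    print(e)  # Service error: rate limit exceeded
```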


Please create an issue on the [GitHub repository](https://github.com/usellm/usellm-py) to report bugs or suggest improvements. Learn more about the original JavaScript library at https://usellm.org.

            
