# lollms_client
[![Python Version](https://img.shields.io/pypi/pyversions/lollms-client)](https://pypi.org/project/lollms-client/) [![PyPI Downloads](https://img.shields.io/pypi/dw/lollms-client)](https://pypi.org/project/lollms-client/) [![Apache License](https://img.shields.io/badge/License-Apache%202.0-blue.svg)](https://www.apache.org/licenses/LICENSE-2.0)
Welcome to the lollms_client repository! This library is built by [ParisNeo](https://github.com/ParisNeo) and provides a convenient way to interact with the lollms (Lord Of Large Language Models) API. It is available on [PyPI](https://pypi.org/project/lollms-client/) and distributed under the Apache 2.0 License.
## Installation
To install the library from PyPI using `pip`, run:
```bash
pip install lollms-client
```
## Usage
To use the lollms_client, first import the necessary classes:
```python
from lollms_client import LollmsClient
# Initialize the LollmsClient instance
lc = LollmsClient("http://localhost:9600")
```
### Text Generation
Use `generate_text()` to generate text through the lollms API.
```python
response = lc.generate_text(prompt="Once upon a time", stream=False, temperature=0.5)
print(response)
```
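When `stream=True`, the client presumably delivers output incrementally rather than as a single string. A minimal sketch of collecting streamed chunks with a callback object (the callback name `streaming_callback` and its return-value convention are assumptions here; the actual lollms-client streaming interface may differ):

```python
class ChunkCollector:
    """Accumulates text chunks delivered by a streaming callback."""

    def __init__(self):
        self.chunks = []

    def __call__(self, chunk):
        # A streaming client would invoke this once per generated chunk.
        self.chunks.append(chunk)
        return True  # hypothetical convention: True means "keep streaming"

    def text(self):
        """Return the full text assembled so far."""
        return "".join(self.chunks)

# Hypothetical usage with the client:
#   collector = ChunkCollector()
#   lc.generate_text(prompt="Once upon a time", stream=True,
#                    streaming_callback=collector)
# Simulated here with hard-coded chunks:
collector = ChunkCollector()
for piece in ["Once ", "upon ", "a ", "time"]:
    collector(piece)
print(collector.text())  # Once upon a time
```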
### Completion
Use `generate_completion()` to get a completion of the prompt from the lollms API.
```python
response = lc.generate_completion(prompt="What is the capital of France", stream=False, temperature=0.5)
print(response)
```
### List Mounted Personalities
List the personalities mounted on the lollms server with the `listMountedPersonalities()` method.
```python
response = lc.listMountedPersonalities()
print(response)
```
### List Models
List the models available on the lollms server with the `listModels()` method.
```python
response = lc.listModels()
print(response)
```
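The model list can drive a simple fallback choice. A hedged sketch, assuming `listModels()` returns a flat list of model-name strings (the real response shape may differ, e.g. a list of dicts):

```python
def pick_model(models, preferred):
    """Return the first preferred model present in `models`,
    falling back to the first available model (or None if empty)."""
    for name in preferred:
        if name in models:
            return name
    return models[0] if models else None

# Example with a hypothetical model list; in practice you would pass
# the result of lc.listModels() here.
available = ["mistral-7b", "llama-3-8b"]
print(pick_model(available, ["llama-3-8b", "gpt-x"]))  # llama-3-8b
print(pick_model(available, ["gpt-x"]))                # mistral-7b
```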
## Complete Example
```python
from lollms_client import LollmsClient
# Initialize the LollmsClient instance
lc = LollmsClient("http://localhost:9600")
# Generate Text
response = lc.generate_text(prompt="Once upon a time", stream=False, temperature=0.5)
print(response)
# Generate Completion
response = lc.generate_completion(prompt="What is the capital of France", stream=False, temperature=0.5)
print(response)
# List Mounted Personalities
response = lc.listMountedPersonalities()
print(response)
# List Models
response = lc.listModels()
print(response)
```
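Since the client talks to a local HTTP server (`http://localhost:9600`), transient connection errors are plausible, for example while the server is still starting up. A minimal retry wrapper you could put around any client call; this is a sketch, not part of the lollms-client API, and the real client may already handle retries internally:

```python
import time

def with_retries(call, attempts=3, delay=0.1):
    """Invoke `call()`, retrying up to `attempts` times on exceptions.

    In practice you would narrow the except clause to the errors the
    underlying HTTP layer raises (e.g. requests.RequestException).
    """
    last_error = None
    for _ in range(attempts):
        try:
            return call()
        except Exception as exc:
            last_error = exc
            time.sleep(delay)
    raise last_error

# Usage with a stub that fails twice before succeeding; with the real
# client this would be e.g.:
#   with_retries(lambda: lc.generate_text(prompt="Hello"))
state = {"calls": 0}
def flaky():
    state["calls"] += 1
    if state["calls"] < 3:
        raise ConnectionError("server not ready")
    return "ok"

print(with_retries(flaky))  # ok
```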
Feel free to contribute to the project by submitting issues or pull requests. Follow [ParisNeo](https://github.com/ParisNeo) on [GitHub](https://github.com/ParisNeo), [Twitter](https://twitter.com/ParisNeo_AI), [Discord](https://discord.gg/BDxacQmv), [r/lollms](https://www.reddit.com/r/lollms/), and [Instagram](https://www.instagram.com/spacenerduino/) for updates and news.
Happy coding!