lollms-client


Name: lollms-client
Version: 0.8.2
Home page: https://github.com/ParisNeo/lollms_client
Summary: A client library for LoLLMs generate endpoint
Upload time: 2025-02-11 14:06:15
Maintainer: None
Docs URL: None
Author: ParisNeo
Requires Python: None
License: None
Keywords: None
Requirements: requests, ascii-colors, pillow, pipmaster, yaml, tiktoken, pydantic, lollmsvectordb, numpy
            # lollms_client

[![Python Version](https://img.shields.io/pypi/pyversions/lollms-client)](https://pypi.org/project/lollms-client/) [![PyPI Downloads](https://img.shields.io/pypi/dw/lollms-client)](https://pypi.org/project/lollms-client/) [![License](https://img.shields.io/badge/License-Apache_2.0-blue.svg)](https://www.apache.org/licenses/LICENSE-2.0)

Welcome to the lollms_client repository! This library is built by [ParisNeo](https://github.com/ParisNeo) and provides a convenient way to interact with the lollms (Lord Of Large Language Models) API. It is available on [PyPI](https://pypi.org/project/lollms-client/) and distributed under the Apache 2.0 License.

## Installation

To install the library from PyPI using `pip`, run:

```bash
pip install lollms-client
```

## Usage

To use the lollms_client, first import the necessary classes:

```python
from lollms_client import LollmsClient, ELF_GENERATION_FORMAT

# Initialize the LollmsClient instance; by default it uses the local lollms service at http://localhost:9600
lc = LollmsClient()
# You can also point it at a different host and port
lc = LollmsClient("http://some.other.server:9600")
# You can also use a local or remote ollama server
lc = LollmsClient(model_name="mistral-nemo:latest", default_generation_mode=ELF_GENERATION_FORMAT.OLLAMA)
# You can also use a local or remote openai server (set your key as an environment variable or pass it here)
lc = LollmsClient(model_name="gpt-3.5-turbo-0125", default_generation_mode=ELF_GENERATION_FORMAT.OPENAI)
```
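
The constructor options above can be summarized in a small helper. This is a sketch, not part of the library: the `client_kwargs` helper and the `host_address` key are assumptions made for illustration, and `ELF_GENERATION_FORMAT` is stubbed with plain strings so the example runs without a lollms server.

```python
from typing import Optional

# Sketch only: string stand-ins for ELF_GENERATION_FORMAT so this runs
# without a lollms server or the real enum.
LOLLMS, OLLAMA, OPENAI = "lollms", "ollama", "openai"

def client_kwargs(backend: str,
                  host: str = "http://localhost:9600",
                  model_name: Optional[str] = None) -> dict:
    """Build hypothetical LollmsClient keyword arguments for a backend choice."""
    if backend == LOLLMS:
        # "host_address" is an assumed parameter name for the host argument
        return {"host_address": host}
    if backend in (OLLAMA, OPENAI):
        if model_name is None:
            raise ValueError(f"the {backend} backend needs a model_name")
        return {"model_name": model_name, "default_generation_mode": backend}
    raise ValueError(f"unknown backend: {backend!r}")

print(client_kwargs("ollama", model_name="mistral-nemo:latest"))
# {'model_name': 'mistral-nemo:latest', 'default_generation_mode': 'ollama'}
```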

### Text Generation

Use the `generate()` method to generate text from the lollms API.

```python
response = lc.generate(prompt="Once upon a time", stream=False, temperature=0.5)
print(response)
```
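
Since `generate()` calls go over HTTP, transient network failures are possible. Below is one way you might wrap a generate-style callable with retries; `generate_with_retry` and the `flaky` stand-in are hypothetical helpers written for this sketch, not part of lollms_client, so the example runs offline.

```python
import time

def generate_with_retry(generate_fn, prompt: str, retries: int = 3, delay: float = 0.0):
    """Call generate_fn(prompt), retrying on exceptions up to `retries` times."""
    last_error = None
    for _attempt in range(retries):
        try:
            return generate_fn(prompt)
        except Exception as exc:  # a real client might catch requests.RequestException
            last_error = exc
            time.sleep(delay)
    raise last_error

# Stand-in generator that fails twice, then succeeds:
calls = {"n": 0}
def flaky(prompt):
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient failure")
    return f"echo: {prompt}"

print(generate_with_retry(flaky, "Once upon a time"))
# echo: Once upon a time
```

With a real client you would pass something like `lambda p: lc.generate(prompt=p, stream=False)` as `generate_fn`.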


### List Mounted Personalities (only on lollms)

List the personalities mounted on the lollms server with the `listMountedPersonalities()` method.

```python
response = lc.listMountedPersonalities()
print(response)
```

### List Models

List the models available on the server with the `listModels()` method.

```python
response = lc.listModels()
print(response)
```
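
The shape of the `listModels()` response depends on the backend and server. Assuming it yields a list of model-name strings, a quick filter might look like this (`find_models` is a hypothetical helper, and the sample list below is made up for illustration):

```python
def find_models(models, substring: str):
    """Return the model names containing `substring`, case-insensitively."""
    return [m for m in models if substring.lower() in m.lower()]

# Made-up sample of what a list of model names might look like:
sample = ["mistral-nemo:latest", "gpt-3.5-turbo-0125", "llama3:8b"]
print(find_models(sample, "mistral"))
# ['mistral-nemo:latest']
```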

## Complete Example

```python
from lollms_client import LollmsClient

# Initialize the LollmsClient instance
lc = LollmsClient()

# Generate Text
response = lc.generate(prompt="Once upon a time", stream=False, temperature=0.5)
print(response)

# List Mounted Personalities
response = lc.listMountedPersonalities()
print(response)

# List Models
response = lc.listModels()
print(response)
```
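
One way to make the complete example configurable is to read the host from an environment variable before constructing the client. `LOLLMS_HOST` is a variable name chosen for this sketch, not an official setting of the library:

```python
import os

def resolve_host(default: str = "http://localhost:9600") -> str:
    """Return the lollms host from the (hypothetical) LOLLMS_HOST env var, else the default."""
    return os.environ.get("LOLLMS_HOST", default)

host = resolve_host()
print(host)
```

You would then pass the result to the constructor, e.g. `LollmsClient(host)`.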

Feel free to contribute to the project by submitting issues or pull requests. Follow [ParisNeo](https://github.com/ParisNeo) on [GitHub](https://github.com/ParisNeo), [Twitter](https://twitter.com/ParisNeo_AI), [Discord](https://discord.gg/BDxacQmv), [Reddit](https://www.reddit.com/r/lollms/), and [Instagram](https://www.instagram.com/spacenerduino/) for updates and news.

Happy coding!

            

Raw data

            {
    "_id": null,
    "home_page": "https://github.com/ParisNeo/lollms_client",
    "name": "lollms-client",
    "maintainer": null,
    "docs_url": null,
    "requires_python": null,
    "maintainer_email": null,
    "keywords": null,
    "author": "ParisNeo",
    "author_email": "parisneoai@gmail.com",
    "download_url": "https://files.pythonhosted.org/packages/9a/86/b4fdf09fa51ce1521814d96b7424fc0f829104bdc21397870b25a5998618/lollms_client-0.8.2.tar.gz",
    "platform": null,
    "bugtrack_url": null,
    "license": null,
    "summary": "A client library for LoLLMs generate endpoint",
    "version": "0.8.2",
    "project_urls": {
        "Homepage": "https://github.com/ParisNeo/lollms_client"
    },
    "split_keywords": [],
    "urls": [
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "337b127c3a8518219b7552c6073bd6c0cf14484595c8350c7a75d60770ae3646",
                "md5": "b7e36ec36bc9cff1897824894b7f53f3",
                "sha256": "9a49f8277630d459fd8d395dc323bba253dece62cb287afdb0e5d6c8c907af9a"
            },
            "downloads": -1,
            "filename": "lollms_client-0.8.2-py3-none-any.whl",
            "has_sig": false,
            "md5_digest": "b7e36ec36bc9cff1897824894b7f53f3",
            "packagetype": "bdist_wheel",
            "python_version": "py3",
            "requires_python": null,
            "size": 60782,
            "upload_time": "2025-02-11T14:06:13",
            "upload_time_iso_8601": "2025-02-11T14:06:13.348456Z",
            "url": "https://files.pythonhosted.org/packages/33/7b/127c3a8518219b7552c6073bd6c0cf14484595c8350c7a75d60770ae3646/lollms_client-0.8.2-py3-none-any.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "9a86b4fdf09fa51ce1521814d96b7424fc0f829104bdc21397870b25a5998618",
                "md5": "09cdbf233c399c140273b45d6911cee0",
                "sha256": "5ab09cbe4c1d9639045d62e51b7a7f4ab487b06519a8120b403883025eb0d3ec"
            },
            "downloads": -1,
            "filename": "lollms_client-0.8.2.tar.gz",
            "has_sig": false,
            "md5_digest": "09cdbf233c399c140273b45d6911cee0",
            "packagetype": "sdist",
            "python_version": "source",
            "requires_python": null,
            "size": 55744,
            "upload_time": "2025-02-11T14:06:15",
            "upload_time_iso_8601": "2025-02-11T14:06:15.757722Z",
            "url": "https://files.pythonhosted.org/packages/9a/86/b4fdf09fa51ce1521814d96b7424fc0f829104bdc21397870b25a5998618/lollms_client-0.8.2.tar.gz",
            "yanked": false,
            "yanked_reason": null
        }
    ],
    "upload_time": "2025-02-11 14:06:15",
    "github": true,
    "gitlab": false,
    "bitbucket": false,
    "codeberg": false,
    "github_user": "ParisNeo",
    "github_project": "lollms_client",
    "travis_ci": false,
    "coveralls": false,
    "github_actions": false,
    "requirements": [
        {
            "name": "requests",
            "specs": [
                [
                    ">=",
                    "2.25.1"
                ]
            ]
        },
        {
            "name": "ascii-colors",
            "specs": []
        },
        {
            "name": "pillow",
            "specs": []
        },
        {
            "name": "pipmaster",
            "specs": []
        },
        {
            "name": "yaml",
            "specs": []
        },
        {
            "name": "tiktoken",
            "specs": []
        },
        {
            "name": "pydantic",
            "specs": []
        },
        {
            "name": "lollmsvectordb",
            "specs": []
        },
        {
            "name": "pipmaster",
            "specs": []
        },
        {
            "name": "numpy",
            "specs": []
        }
    ],
    "lcname": "lollms-client"
}
        