kscope

Name: kscope
Version: 0.11.0
Home page: https://github.com/VectorInstitute/kaleidoscope-sdk
Summary: A user toolkit for analyzing and interfacing with Large Language Models (LLMs)
Upload time: 2024-07-11 03:02:50
Maintainer: None
Docs URL: None
Author: Vector AI Engineering
Requires Python: None
License: MIT
Keywords: python, nlp, machine-learning, deep-learning, distributed-computing, neural-networks, tensor, llm
Requirements: no requirements were recorded
            ![Kaleidoscope](https://user-images.githubusercontent.com/72175053/229659396-2a61cd69-eafa-4a96-8e1c-d93519a8f617.png)
-----------------
# Kaleidoscope-SDK
![PyPI](https://img.shields.io/pypi/v/kscope)
![PyPI - Python Version](https://img.shields.io/pypi/pyversions/kscope)
![GitHub](https://img.shields.io/github/license/VectorInstitute/kaleidoscope-sdk)
![DOI](https://img.shields.io/badge/DOI-in--progress-blue)
[![Documentation](https://img.shields.io/badge/api-reference-lightgrey.svg)](https://kaleidoscope-sdk.readthedocs.io/en/latest/)

A user toolkit for analyzing and interfacing with Large Language Models (LLMs)


## Overview

``kaleidoscope-sdk`` is a Python module used to interact with large language models
hosted via the Kaleidoscope service available at: https://github.com/VectorInstitute/kaleidoscope.
It provides a simple interface to launch LLMs on an HPC cluster and perform basic, fast inference.
These features are exposed via a few high-level APIs, namely:

* `model_instances` - Shows a list of all active LLMs instantiated by the model service
* `load_model` - Loads an LLM via the model service
* `generate` - Returns an LLM text generation for a single prompt or a list of prompts
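
For orientation, here is a minimal sketch of how these calls fit together. The gateway host, port, and model name are placeholders taken from the sample workflow below, which also shows how to wait for a model instance to become active before generating:

```python
import kscope

# Connect to a Kaleidoscope gateway (host and port are illustrative placeholders)
client = kscope.Client(gateway_host="llm.cluster.local", gateway_port=3001)

print(client.model_instances)             # active LLM instances on the model service
model = client.load_model("llama3-8b")    # launch (or attach to) a model instance
print(model.generate("Hello!", {'max_tokens': 10}))  # quick text generation
```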



## Getting Started

Requires Python version >= 3.8

### Install

```bash
python3 -m pip install kscope
```
or install from source:

```bash
pip install git+https://github.com/VectorInstitute/kaleidoscope-sdk.git
```
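
To confirm the installation, you can check the installed package version from Python; this uses the standard-library `importlib.metadata` (available on Python 3.8+), not anything specific to kscope:

```python
from importlib.metadata import version

# Prints the installed kscope version, e.g. "0.11.0"
print(version("kscope"))
```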

### Authentication

In order to submit generation jobs, a designated Vector Institute cluster account is required. Please contact the
[AI Engineering Team](mailto:ai_engineering@vectorinstitute.ai?subject=[Github]%20Kaleidoscope)
in charge of Kaleidoscope for more information.

### Sample Workflow

The following workflow shows how to load and interact with a Llama3-8B model
on the Vector Institute Vaughan cluster.

```python
#!/usr/bin/env python3
import kscope
import time

# Establish a client connection to the Kaleidoscope service
# If you have not previously authenticated with the service, you will be prompted to do so now
client = kscope.Client(gateway_host="llm.cluster.local", gateway_port=3001)

# See which models are supported
print(client.models)

# See which models are instantiated and available to use
print(client.model_instances)

# Get a handle to a model. If this model is not actively running, it will get launched in the background.
# In this example we want to use the Llama3 8b model
llama3_model = client.load_model("llama3-8b")

# If the model was not actively running, it could take several minutes to load. Wait for it to come online.
while llama3_model.state != "ACTIVE":
    time.sleep(1)

# Sample text generation with input parameters
text_gen = llama3_model.generate("What is Vector Institute?", {'max_tokens': 5, 'top_k': 4, 'temperature': 0.5})
print(dir(text_gen))                        # methods associated with the generated text object
print(text_gen.generation['sequences'])     # generated text only
print(text_gen.generation['logprobs'])      # log-probabilities
print(text_gen.generation['tokens'])        # tokens

```
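
Because `generate` also accepts a list of inputs, batched generation follows the same pattern. A minimal sketch, assuming the same parameter dictionary is applied to every prompt in the batch and that `generation['sequences']` returns one entry per prompt:

```python
# Batched generation: pass a list of prompts instead of a single string
prompts = [
    "What is Vector Institute?",
    "Summarize the Kaleidoscope service in one sentence.",
]
batch_gen = llama3_model.generate(prompts, {'max_tokens': 20, 'top_k': 4, 'temperature': 0.5})

# One generated sequence per input prompt
for prompt, sequence in zip(prompts, batch_gen.generation['sequences']):
    print(prompt, "->", sequence)
```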

## Documentation
Full documentation and API reference are available at: http://kaleidoscope-sdk.readthedocs.io.


## Contributing
Contributions to kaleidoscope are welcome. See [Contributing](CONTRIBUTING) for
guidelines.


## License
[MIT](LICENSE)


## Citation
Reference to cite when using Kaleidoscope in a project or research paper:
```
Willes, J., Choi, M., Coatsworth, M., Shen, G., & Sivaloganathan, J. (2022). Kaleidoscope. http://VectorInstitute.github.io/kaleidoscope. Computer software, Vector Institute for Artificial Intelligence. Retrieved from https://github.com/VectorInstitute/kaleidoscope-sdk.git.
```

            

Raw data

            {
    "_id": null,
    "home_page": "https://github.com/VectorInstitute/kaleidoscope-sdk",
    "name": "kscope",
    "maintainer": null,
    "docs_url": null,
    "requires_python": null,
    "maintainer_email": null,
    "keywords": "python nlp machine-learning deep-learning distributed-computing neural-networks tensor llm",
    "author": "['Vector AI Engineering']",
    "author_email": "ai_engineering@vectorinstitute.ai",
    "download_url": "https://files.pythonhosted.org/packages/b0/20/0f329202273e0d0fdd790653bf92117875e9fca45d33af8491e4714324fe/kscope-0.11.0.tar.gz",
    "platform": null,
    "description": "![Kaleidoscope](https://user-images.githubusercontent.com/72175053/229659396-2a61cd69-eafa-4a96-8e1c-d93519a8f617.png)\n-----------------\n# Kaleidoscope-SDK\n![PyPI](https://img.shields.io/pypi/v/kscope)\n![PyPI - Python Version](https://img.shields.io/pypi/pyversions/kscope)\n![GitHub](https://img.shields.io/github/license/VectorInstitute/kaleidoscope-sdk)\n![DOI](https://img.shields.io/badge/DOI-in--progress-blue)\n[![Documentation](https://img.shields.io/badge/api-reference-lightgrey.svg)](https://kaleidoscope-sdk.readthedocs.io/en/latest/)\n\nA user toolkit for analyzing and interfacing with Large Language Models (LLMs)\n\n\n## Overview\n\n``kaleidoscope-sdk`` is a Python module used to interact with large language models\nhosted via the Kaleidoscope service available at: https://github.com/VectorInstitute/kaleidoscope.\nIt provides a simple interface to launch LLMs on an HPC cluster and perform basic, fast inference.\nThese features are exposed via a few high-level APIs, namely:\n\n* `model_instances` - Shows a list of all active LLMs instantiated by the model service\n* `load_model` - Loads an LLM via the model service\n* `generate` - Returns an LLM text generation based on prompt input, or list of inputs\n\n\n\n## Getting Started\n\nRequires Python version >= 3.8\n\n### Install\n\n```bash\npython3 -m pip install kscope\n```\nor install from source:\n\n```bash\npip install git+https://github.com/VectorInstitute/kaleidoscope-sdk.git\n```\n\n### Authentication\n\nIn order to submit generation jobs, a designated Vector Institute cluster account is required. Please contact the\n[AI Engineering Team](mailto:ai_engineering@vectorinstitute.ai?subject=[Github]%20Kaleidoscope)\nin charge of Kaleidoscope for more information.\n\n### Sample Workflow\n\nThe following workflow shows how to load and interact with an OPT-175B model\non the Vector Institute Vaughan cluster.\n\n```python\n#!/usr/bin/env python3\nimport kscope\nimport time\n\n# Establish a client connection to the Kaleidoscope service\n# If you have not previously authenticated with the service, you will be prompted to now\nclient = kscope.Client(gateway_host=\"llm.cluster.local\", gateway_port=3001)\n\n# See which models are supported\nclient.models\n\n# See which models are instantiated and available to use\nclient.model_instances\n\n# Get a handle to a model. If this model is not actively running, it will get launched in the background.\n# In this example we want to use the Llama3 8b model\nllama3_model = client.load_model(\"llama3-8b\")\n\n# If the model was not actively running, this it could take several minutes to load. Wait for it come online.\nwhile llama3_model.state != \"ACTIVE\":\n    time.sleep(1)\n\n# Sample text generation w/ input parameters\ntext_gen = llama3_model.generate(\"What is Vector Institute?\", {'max_tokens': 5, 'top_k': 4, 'temperature': 0.5})\ndir(text_gen) # display methods associated with generated text object\ntext_gen.generation['sequences'] # display only text\ntext_gen.generation['logprobs'] # display logprobs\ntext_gen.generation['tokens'] # display tokens\n\n```\n\n## Documentation\nFull documentation and API reference are available at: http://kaleidoscope-sdk.readthedocs.io.\n\n\n## Contributing\nContributing to kaleidoscope is welcomed. 
See [Contributing](CONTRIBUTING) for\nguidelines.\n\n\n## License\n[MIT](LICENSE)\n\n\n## Citation\nReference to cite when you use Kaleidoscope in a project or a research paper:\n```\nWilles, J., Choi, M., Coatsworth, M., Shen, G., & Sivaloganathan, J (2022). Kaleidoscope. http://VectorInstitute.github.io/kaleidoscope. computer software, Vector Institute for Artificial Intelligence. Retrieved from https://github.com/VectorInstitute/kaleidoscope-sdk.git.\n```\n",
    "bugtrack_url": null,
    "license": "MIT",
    "summary": "A user toolkit for analyzing and interfacing with Large Language Models (LLMs)",
    "version": "0.11.0",
    "project_urls": {
        "Homepage": "https://github.com/VectorInstitute/kaleidoscope-sdk"
    },
    "split_keywords": [
        "python",
        "nlp",
        "machine-learning",
        "deep-learning",
        "distributed-computing",
        "neural-networks",
        "tensor",
        "llm"
    ],
    "urls": [
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "2d5bbd90db7b039186a4b1983bb0b90a92be0b9de6710428fb8eebd09a4fbb20",
                "md5": "97ed17bafb0365a3f2cea19d1f6eacd7",
                "sha256": "8927e24b1899384448cc6456dbb39b9042165db0922882b2d7ef4f951fae5159"
            },
            "downloads": -1,
            "filename": "kscope-0.11.0-py3-none-any.whl",
            "has_sig": false,
            "md5_digest": "97ed17bafb0365a3f2cea19d1f6eacd7",
            "packagetype": "bdist_wheel",
            "python_version": "py3",
            "requires_python": null,
            "size": 7761,
            "upload_time": "2024-07-11T03:02:48",
            "upload_time_iso_8601": "2024-07-11T03:02:48.909162Z",
            "url": "https://files.pythonhosted.org/packages/2d/5b/bd90db7b039186a4b1983bb0b90a92be0b9de6710428fb8eebd09a4fbb20/kscope-0.11.0-py3-none-any.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "b0200f329202273e0d0fdd790653bf92117875e9fca45d33af8491e4714324fe",
                "md5": "a82b5fd6e78899720f1544e179cc86be",
                "sha256": "c6d6157a70abb03327f0dea0592a1cd15eb4ecb31ba751f765fdfb9ff3d4e975"
            },
            "downloads": -1,
            "filename": "kscope-0.11.0.tar.gz",
            "has_sig": false,
            "md5_digest": "a82b5fd6e78899720f1544e179cc86be",
            "packagetype": "sdist",
            "python_version": "source",
            "requires_python": null,
            "size": 7972,
            "upload_time": "2024-07-11T03:02:50",
            "upload_time_iso_8601": "2024-07-11T03:02:50.752919Z",
            "url": "https://files.pythonhosted.org/packages/b0/20/0f329202273e0d0fdd790653bf92117875e9fca45d33af8491e4714324fe/kscope-0.11.0.tar.gz",
            "yanked": false,
            "yanked_reason": null
        }
    ],
    "upload_time": "2024-07-11 03:02:50",
    "github": true,
    "gitlab": false,
    "bitbucket": false,
    "codeberg": false,
    "github_user": "VectorInstitute",
    "github_project": "kaleidoscope-sdk",
    "travis_ci": false,
    "coveralls": false,
    "github_actions": true,
    "lcname": "kscope"
}
        