openllm-core


Name: openllm-core
Version: 0.4.44
Summary: OpenLLM Core: Core components for OpenLLM.
Upload time: 2024-02-06 03:08:52
Requires Python: >=3.8
Author email: Aaron Pham <aarnphm@bentoml.com>, BentoML Team <contact@bentoml.com>
Keywords: ai, alpaca, bentoml, falcon, fine tuning, generative ai, llmops, large language model, llama 2, mlops, model deployment, model serving, pytorch, serverless, stablelm, transformers, vicuna
Requirements: none recorded.
            <p align="center">
  <a href="https://github.com/bentoml/openllm">
    <img src="https://raw.githubusercontent.com/bentoml/openllm/main/.github/assets/main-banner.png" alt="Banner for OpenLLM" />
  </a>
</p>


<div align="center">
    <h1 align="center">🦑 OpenLLM Core</h1>
    <a href="https://pypi.org/project/openllm-core">
        <img src="https://img.shields.io/pypi/v/openllm-core.svg?logo=pypi&label=PyPI&logoColor=gold" alt="pypi_status" />
    </a><a href="https://test.pypi.org/project/openllm-core/">
        <img src="https://img.shields.io/badge/Nightly-PyPI?logo=pypi&label=PyPI&color=gray&link=https%3A%2F%2Ftest.pypi.org%2Fproject%2Fopenllm%2F" alt="test_pypi_status" />
    </a><a href="https://twitter.com/bentomlai">
        <img src="https://badgen.net/badge/icon/@bentomlai/1DA1F2?icon=twitter&label=Follow%20Us" alt="Twitter" />
    </a><a href="https://l.bentoml.com/join-openllm-discord">
        <img src="https://badgen.net/badge/icon/OpenLLM/7289da?icon=discord&label=Join%20Us" alt="Discord" />
    </a><a href="https://github.com/bentoml/OpenLLM/actions/workflows/ci.yml">
        <img src="https://github.com/bentoml/OpenLLM/actions/workflows/ci.yml/badge.svg?branch=main" alt="ci" />
    </a><a href="https://results.pre-commit.ci/latest/github/bentoml/OpenLLM/main">
        <img src="https://results.pre-commit.ci/badge/github/bentoml/OpenLLM/main.svg" alt="pre-commit.ci status" />
    </a><br>
    <a href="https://pypi.org/project/openllm-core">
        <img src="https://img.shields.io/pypi/pyversions/openllm-core.svg?logo=python&label=Python&logoColor=gold" alt="python_version" />
    </a><a href="https://github.com/pypa/hatch">
        <img src="https://img.shields.io/badge/%F0%9F%A5%9A-Hatch-4051b5.svg" alt="Hatch" />
    </a><a href="https://github.com/bentoml/OpenLLM/blob/main/STYLE.md">
        <img src="https://img.shields.io/badge/code%20style-experimental-000000.svg" alt="code style" />
    </a><a href="https://github.com/astral-sh/ruff">
        <img src="https://img.shields.io/endpoint?url=https://raw.githubusercontent.com/charliermarsh/ruff/main/assets/badge/v2.json" alt="Ruff" />
    </a><a href="https://github.com/python/mypy">
        <img src="https://img.shields.io/badge/types-mypy-blue.svg" alt="types - mypy" />
    </a><a href="https://github.com/microsoft/pyright">
        <img src="https://img.shields.io/badge/types-pyright-yellow.svg" alt="types - pyright" />
    </a><br>
    <p>OpenLLM Core: Core components for OpenLLM.<br/></p>
    <i></i>
</div>

## 📖 Introduction

With OpenLLM, you can run inference with any open-source large language model,
deploy it to the cloud or on-premises, and build powerful AI applications.

To learn more about OpenLLM, see the <a href="https://github.com/bentoml/OpenLLM">OpenLLM README</a>.

This package holds the core components of OpenLLM and is considered internal.

Its components include:

- Configuration generation.
- Utilities for interacting with OpenLLM server.
- Schema and generation utilities for OpenLLM server.
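
A rough sketch of what "interacting with an OpenLLM server" looks like from the client side. The `/v1/generate` route and the `prompt`/`llm_config` payload shape are assumptions about the server's HTTP API, not guarantees from this package:

```python
import json
from urllib import request


def build_generate_request(base_url: str, prompt: str, **config) -> request.Request:
    # Assemble a POST request for an OpenLLM server's (assumed) /v1/generate route.
    body = json.dumps({'prompt': prompt, 'llm_config': config}).encode()
    return request.Request(
        f"{base_url.rstrip('/')}/v1/generate",
        data=body,
        headers={'Content-Type': 'application/json'},
        method='POST',
    )


# Sending it is then a single call (requires a running server):
#   with request.urlopen(build_generate_request('http://localhost:3000', 'Hello')) as resp:
#       print(resp.read())
```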

<p align="center">
  <img src="https://raw.githubusercontent.com/bentoml/openllm/main/.github/assets/output.gif" alt="Gif showing OpenLLM Intro" />
</p>

<p align="center">
  <img src="https://raw.githubusercontent.com/bentoml/openllm/main/.github/assets/agent.gif" alt="Gif showing Agent integration" />
</p>

## 📔 Citation

If you use OpenLLM in your research, you can use the following [citation](../CITATION.cff):

```bibtex
@software{Pham_OpenLLM_Operating_LLMs_2023,
  author = {Pham, Aaron and Yang, Chaoyu and Sheng, Sean and Zhao, Shenyang and Lee, Sauyon and Jiang, Bo and Dong, Fog and Guan, Xipeng and Ming, Frost},
  license = {Apache-2.0},
  month = jun,
  title = {{OpenLLM: Operating LLMs in production}},
  url = {https://github.com/bentoml/OpenLLM},
  year = {2023}
}
```

---

[See the full changelog](https://github.com/bentoml/openllm/blob/main/CHANGELOG.md)

            

Raw data

            {
    "_id": null,
    "home_page": "",
    "name": "openllm-core",
    "maintainer": "",
    "docs_url": null,
    "requires_python": ">=3.8",
    "maintainer_email": "",
    "keywords": "AI,Alpaca,BentoML,Falcon,Fine tuning,Generative AI,LLMOps,Large Language Model,Llama 2,MLOps,Model Deployment,Model Serving,PyTorch,Serverless,StableLM,Transformers,Vicuna",
    "author": "",
    "author_email": "Aaron Pham <aarnphm@bentoml.com>, BentoML Team <contact@bentoml.com>",
    "download_url": "https://files.pythonhosted.org/packages/64/04/db874982c94d7f371c589c6c1cbe7867ca3ac0887645d05aa6ab7a5a0ec4/openllm_core-0.4.44.tar.gz",
    "platform": null,
    "description": "(verbatim copy of the README shown above; omitted here)",
    "bugtrack_url": null,
    "license": "",
    "summary": "OpenLLM Core: Core components for OpenLLM.",
    "version": "0.4.44",
    "project_urls": {
        "Blog": "https://modelserving.com",
        "Chat": "https://l.bentoml.com/join-openllm-discord",
        "Documentation": "https://github.com/bentoml/OpenLLM/blob/main/openllm-core/README.md",
        "GitHub": "https://github.com/bentoml/OpenLLM/blob/main/openllm-core",
        "History": "https://github.com/bentoml/OpenLLM/blob/main/CHANGELOG.md",
        "Homepage": "https://bentoml.com",
        "Tracker": "https://github.com/bentoml/OpenLLM/issues",
        "Twitter": "https://twitter.com/bentomlai"
    },
    "split_keywords": [
        "ai",
        "alpaca",
        "bentoml",
        "falcon",
        "fine tuning",
        "generative ai",
        "llmops",
        "large language model",
        "llama 2",
        "mlops",
        "model deployment",
        "model serving",
        "pytorch",
        "serverless",
        "stablelm",
        "transformers",
        "vicuna"
    ],
    "urls": [
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "9c64eb33d7c6da397dd89d9dfa3e12b14f0e3716b0c31b0ed9f4848f178eca3a",
                "md5": "726c1b68c84f9e039f2e0339946b93ff",
                "sha256": "7ee775980b4864b20906a7a6866e022e04ba8103977f88f30c259af4216b9be6"
            },
            "downloads": -1,
            "filename": "openllm_core-0.4.44-py3-none-any.whl",
            "has_sig": false,
            "md5_digest": "726c1b68c84f9e039f2e0339946b93ff",
            "packagetype": "bdist_wheel",
            "python_version": "py3",
            "requires_python": ">=3.8",
            "size": 72677,
            "upload_time": "2024-02-06T03:08:47",
            "upload_time_iso_8601": "2024-02-06T03:08:47.290859Z",
            "url": "https://files.pythonhosted.org/packages/9c/64/eb33d7c6da397dd89d9dfa3e12b14f0e3716b0c31b0ed9f4848f178eca3a/openllm_core-0.4.44-py3-none-any.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "6404db874982c94d7f371c589c6c1cbe7867ca3ac0887645d05aa6ab7a5a0ec4",
                "md5": "9a0bd7ed0bb98993f07827db86a00aae",
                "sha256": "168970c7c77d452d35a3dc62f6a1348e8d4375128bae3e840c5b93032197f094"
            },
            "downloads": -1,
            "filename": "openllm_core-0.4.44.tar.gz",
            "has_sig": false,
            "md5_digest": "9a0bd7ed0bb98993f07827db86a00aae",
            "packagetype": "sdist",
            "python_version": "source",
            "requires_python": ">=3.8",
            "size": 58248,
            "upload_time": "2024-02-06T03:08:52",
            "upload_time_iso_8601": "2024-02-06T03:08:52.584285Z",
            "url": "https://files.pythonhosted.org/packages/64/04/db874982c94d7f371c589c6c1cbe7867ca3ac0887645d05aa6ab7a5a0ec4/openllm_core-0.4.44.tar.gz",
            "yanked": false,
            "yanked_reason": null
        }
    ],
    "upload_time": "2024-02-06 03:08:52",
    "github": true,
    "gitlab": false,
    "bitbucket": false,
    "codeberg": false,
    "github_user": "bentoml",
    "github_project": "OpenLLM",
    "travis_ci": false,
    "coveralls": false,
    "github_actions": true,
    "lcname": "openllm-core"
}
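
The `digests` recorded for each artifact above can be used to verify a download before installing it. A minimal sketch using only the standard library; the file path in the usage comment is illustrative:

```python
import hashlib

# Published sha256 for openllm_core-0.4.44.tar.gz, taken from the "digests" above.
EXPECTED_SDIST_SHA256 = '168970c7c77d452d35a3dc62f6a1348e8d4375128bae3e840c5b93032197f094'


def sha256_matches(data: bytes, expected_hex: str) -> bool:
    # Compare the sha256 of the downloaded bytes against the published digest.
    return hashlib.sha256(data).hexdigest() == expected_hex.lower()


# Usage (after downloading the sdist):
#   with open('openllm_core-0.4.44.tar.gz', 'rb') as f:
#       assert sha256_matches(f.read(), EXPECTED_SDIST_SHA256)
```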
        