openlm

Name: openlm
Version: 0.0.5
Summary: Drop-in OpenAI-compatible library that can call LLMs from other providers
Author: Matt Rickard
Maintainer: Matt Rickard
Requires Python: >=3.8.1,<4.0
License: MIT
Keywords: llm, ai, prompt, large language models, gpt-3, chatgpt
Repository: https://github.com/r2d4/openlm
Upload time: 2023-05-19 16:42:25
Requirements: no requirements were recorded

# OpenLM

A drop-in, OpenAI-compatible library that can call LLMs from other providers (e.g., HuggingFace, Cohere, and more).

```diff
1c1
< import openai
---
> import openlm as openai

completion = openai.Completion.create(
    model=["bloom-560m", "cohere.ai/command"], 
    prompt=["Hello world!", "A second prompt!"]
)
print(completion)
```

### Features
* Takes in the same parameters as OpenAI's Completion API and returns a similarly structured response. 
* Call models from HuggingFace's inference endpoint API, Cohere.ai, OpenAI, or your custom implementation. 
* Complete multiple prompts on multiple models in the same request. 
* Very small footprint: OpenLM calls the inference APIs directly rather than using multiple SDKs.
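The multiplexing feature fans a list of models and a list of prompts out into one completion per (model, prompt) pair, which is why the example response later in this README contains `models × prompts` choices. A minimal sketch of that pairing (the `multiplex` helper is hypothetical, for illustration only; the `model_idx`/prompt-`index` fields correspond to those in the response shown later):

```python
from itertools import product

def multiplex(models, prompts):
    """Pair every model with every prompt, the way a single
    Completion.create call with list arguments fans out.
    Returns (model_idx, prompt_idx, model, prompt) tuples."""
    return [
        (mi, pi, model, prompt)
        for (mi, model), (pi, prompt) in product(
            enumerate(models), enumerate(prompts)
        )
    ]

jobs = multiplex(
    ["ada", "huggingface.co/gpt2"],
    ["The quick brown fox", "Who jumped over the lazy dog?"],
)
print(len(jobs))  # 2 models x 2 prompts -> 4
```
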


### Installation
```bash
pip install openlm
```

### Examples

- [Import as OpenAI](examples/as_openai.py)
- [Set up API keys via environment variables or pass a dict](examples/api_keys.py)
- [Add a custom model or provider](examples/custom_provider.py)
- [Complete multiple prompts on multiple models](examples/multiplex.py)
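Model strings in this README encode the provider as a domain prefix (`cohere.ai/command`, `huggingface.co/gpt2`), and the example response below resolves the bare name `ada` to `openai.com/ada`. A sketch of one plausible routing rule under those assumptions (`split_model` is a hypothetical helper, not openlm's API; bare names such as `bloom-560m` may be routed differently in practice):

```python
def split_model(model: str):
    """Split 'provider/model' strings into (provider, model).
    Bare names are assumed to default to OpenAI, matching how
    'ada' appears as 'openai.com/ada' in the example response."""
    if "/" in model:
        provider, name = model.split("/", 1)
        return provider, name
    return "openai.com", model

print(split_model("cohere.ai/command"))    # ('cohere.ai', 'command')
print(split_model("huggingface.co/gpt2"))  # ('huggingface.co', 'gpt2')
print(split_model("ada"))                  # ('openai.com', 'ada')
```
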

OpenLM currently supports the Completion endpoint; support for additional standardized endpoints will be added over time where they make sense.

### [Example with Response](examples/multiplex.py)

```python
import json
import sys
from pathlib import Path

# Allow running this example from a repo checkout without installing openlm.
sys.path.append(str(Path(__file__).resolve().parent.parent))

import openlm

completion = openlm.Completion.create(
    model=["ada", "huggingface.co/gpt2", "cohere.ai/command"],
    prompt=["The quick brown fox", "Who jumped over the lazy dog?"],
    max_tokens=15
)
print(json.dumps(completion, indent=4))
```

```json
{
    "id": "504cc502-dc27-43e7-bcc3-b62e178c247e",
    "object": "text_completion",
    "created": 1683583267,
    "choices": [
        {
            "id": "c0487ba2-935d-4dec-b191-f7eff962f117",
            "model_idx": 0,
            "model_name": "openai.com/ada",
            "index": 0,
            "created": 1683583233,
            "text": " jumps into the much bigger brown bush.\" \"Alright, people like you can",
            "usage": {
                "prompt_tokens": 4,
                "completion_tokens": 15,
                "total_tokens": 19
            },
            "extra": {
                "id": "cmpl-7E3CCSpJHXfx5yB0TaJU9ON7rNYPT"
            }
        },
        {
            "id": "bab92d11-5ba6-4da2-acca-1f3398a78c3e",
            "model_idx": 0,
            "model_name": "openai.com/ada",
            "index": 1,
            "created": 1683583233,
            "text": "\n\nIt turns out that saying one's name \"Joe\" is the",
            "usage": {
                "prompt_tokens": 7,
                "completion_tokens": 15,
                "total_tokens": 22
            },
            "extra": {
                "id": "cmpl-7E3CDBbqFy92I2ZbSGoDT5ickAiPD"
            }
        },
        {
            "id": "be870636-9d9e-4f74-b8bd-d04766072a7b",
            "model_idx": 1,
            "model_name": "huggingface.co/gpt2",
            "index": 0,
            "created": 1683583234,
            "text": "The quick brown foxes, and the short, snuggly fox-scented, soft foxes we have in our household\u2026 all come in two distinct flavours: yellow and orange; and red and white. This mixture is often confused with"
        },
        {
            "id": "c1abf535-54a9-4b72-8681-d3b4a601da88",
            "model_idx": 1,
            "model_name": "huggingface.co/gpt2",
            "index": 1,
            "created": 1683583266,
            "text": "Who jumped over the lazy dog? He probably got it, but there's only so much you do when you lose one.\n\nBut I will say for a moment that there's no way this guy might have picked a fight with Donald Trump."
        },
        {
            "id": "08e8c351-236a-4497-98f3-488cdc0b6b6a",
            "model_idx": 2,
            "model_name": "cohere.ai/command",
            "index": 0,
            "created": 1683583267,
            "text": "\njumps over the lazy dog.",
            "extra": {
                "request_id": "0bbb28c0-eb3d-4614-b4d9-1eca88c361ca",
                "generation_id": "5288dd6f-3ecf-475b-b909-0b226be6a193"
            }
        },
        {
            "id": "49ce51e6-9a18-4093-957f-54a1557c8829",
            "model_idx": 2,
            "model_name": "cohere.ai/command",
            "index": 1,
            "created": 1683583267,
            "text": "\nThe quick brown fox.",
            "extra": {
                "request_id": "ab5d5e03-22a1-42cd-85b2-9b9704c79304",
                "generation_id": "60493966-abf6-483c-9c47-2ea5c5eeb855"
            }
        }
    ],
    "usage": {
        "prompt_tokens": 11,
        "completion_tokens": 30,
        "total_tokens": 41
    }
}
```
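Because the response is a plain OpenAI-style dict, it can be post-processed with ordinary Python. A sketch that groups choices by model, using an abbreviated stand-in for the response above (texts truncated for brevity):

```python
from collections import defaultdict

# Abbreviated stand-in for the response shown above.
completion = {
    "choices": [
        {"model_name": "openai.com/ada", "index": 0, "text": " jumps into..."},
        {"model_name": "openai.com/ada", "index": 1, "text": "\n\nIt turns out..."},
        {"model_name": "huggingface.co/gpt2", "index": 0, "text": "The quick brown foxes..."},
        {"model_name": "cohere.ai/command", "index": 0, "text": "\njumps over the lazy dog."},
    ],
}

# Collect each model's generations in prompt order.
by_model = defaultdict(list)
for choice in completion["choices"]:
    by_model[choice["model_name"]].append(choice["text"])

for model, texts in by_model.items():
    print(model, len(texts))
```
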

### Other Languages
[r2d4/llm.ts](https://github.com/r2d4/llm.ts) is a TypeScript library with a similar API that sits on top of multiple language models.

### Roadmap
- [ ] Streaming API
- [ ] Embeddings API

### Contributing
Contributions are welcome! Please open an issue or submit a PR.

### License
[MIT](LICENSE)


            
