orign

Name: orign
Version: 0.1.14
Summary: A Python client for Orign
Upload time: 2024-12-12 22:56:21
Author: Patrick Barker
Requires Python: <4.0,>=3.10
License: MIT

# orign-py

A Python client for [Orign](https://github.com/agentsea/orign)

## Installation

```bash
pip install orign
```

Install the Orign CLI

```sh
curl -fsSL -H "Cache-Control: no-cache" https://storage.googleapis.com/orign/releases/install.sh | bash
```

Login to Orign

```sh
$ orign login
```

## Usage

Get a list of available models

```sh
$ orign get models
```

### Chat

Define which model we would like to use

```python
from orign import ChatModel

model = ChatModel(model="allenai/Molmo-7B-D-0924", provider="vllm")
```

Open a socket connection to the model

```python
model.connect()
```

Chat with the model

```python
model.chat(msg="What's in this image?", image="https://tinyurl.com/2fz6ms35")
```
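
The README doesn't document what `chat()` returns when not streaming; as a minimal sketch, assuming the call returns a response object you can capture and inspect:

```python
# Hypothetical usage: the return type of chat() isn't documented here, so this
# just captures whatever the call returns and prints it for inspection.
response = model.chat(msg="What's in this image?", image="https://tinyurl.com/2fz6ms35")
print(response)
```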

Stream tokens from the model

```python
for response in model.chat(msg="What is the capital of France?", stream_tokens=True):
    print(response)
```

Send a thread of messages to the model

```python
model.chat(prompt=[
    {"role": "user", "content": "What is the capital of France?"},
    {"role": "assistant", "content": "Paris"},
    {"role": "user", "content": "When was it built?"}
])
```

Send a batch of threads to the model

```python
model.chat(batch=[
    [{"role": "user", "content": "What is the capital of France?"}, {"role": "assistant", "content": "Paris"}, {"role": "user", "content": "When was it built?"}],
    [{"role": "user", "content": "What is the capital of Spain?"}, {"role": "assistant", "content": "Madrid"}, {"role": "user", "content": "When was it built?"}]
])
```
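
A batch call presumably produces one result per thread. The sketch below pairs each thread with its response; it assumes `chat(batch=...)` returns a list in submission order, which is not confirmed by this README.

```python
# Hypothetical sketch: assumes chat(batch=...) returns one result per thread,
# in the same order the threads were submitted. Verify against the actual API.
threads = [
    [{"role": "user", "content": "What is the capital of France?"}],
    [{"role": "user", "content": "What is the capital of Spain?"}],
]
responses = model.chat(batch=threads)
for thread, response in zip(threads, responses):
    print(thread[-1]["content"], "->", response)
```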

Use the async API

```python
from orign import AsyncChatModel

model = AsyncChatModel(model="allenai/Molmo-7B-D-0924", provider="vllm")
await model.connect()

async for response in model.chat(
    msg="What is the capital of france?", stream_tokens=True
):
    print(response)
```
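
The async snippet above assumes you are already inside a coroutine. A minimal way to run it as a standalone script is to wrap it in an `async` function and drive it with the standard library's `asyncio.run`:

```python
import asyncio

from orign import AsyncChatModel


async def main():
    # Same calls as above, wrapped in a coroutine so asyncio.run can drive them.
    model = AsyncChatModel(model="allenai/Molmo-7B-D-0924", provider="vllm")
    await model.connect()

    async for response in model.chat(
        msg="What is the capital of France?", stream_tokens=True
    ):
        print(response)


asyncio.run(main())
```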

### Embeddings

Define which model we would like to use

```python
from orign import EmbeddingModel

model = EmbeddingModel(provider="sentence-tf", model="clip-ViT-B-32")
```

Embed text

```python
model.embed(text="What is the capital of France?")
```

Embed an image

```python
model.embed(image="https://example.com/image.jpg")
```

Embed text and image

```python
model.embed(text="What is the capital of France?", image="https://example.com/image.jpg")
```
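
Because CLIP maps text and images into a shared vector space, the two embeddings can be compared directly. The return shape of `embed()` isn't documented here, so the sketch below assumes it yields a flat numeric vector; adjust the unpacking if the client wraps it in a response object.

```python
# Hedged sketch: assumes embed() returns a flat numeric vector for a single
# text or image input; the actual response shape may differ.
import numpy as np

text_vec = np.asarray(model.embed(text="a photo of the Eiffel Tower"), dtype=float)
image_vec = np.asarray(model.embed(image="https://example.com/image.jpg"), dtype=float)

# Cosine similarity between the text and image embeddings.
similarity = text_vec @ image_vec / (np.linalg.norm(text_vec) * np.linalg.norm(image_vec))
print(similarity)
```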

Use the async API

```python
from orign import AsyncEmbeddingModel

model = AsyncEmbeddingModel(provider="sentence-tf", model="clip-ViT-B-32")
await model.connect()

await model.embed(text="What is the capital of France?")
```
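
With the async client you could also issue several embedding requests concurrently via `asyncio.gather`. Whether the underlying connection tolerates overlapping requests is an assumption here, so treat this as a sketch:

```python
import asyncio

from orign import AsyncEmbeddingModel


async def main():
    model = AsyncEmbeddingModel(provider="sentence-tf", model="clip-ViT-B-32")
    await model.connect()

    # Hypothetical: fires three embed calls concurrently and waits for all of
    # them; assumes the client supports overlapping in-flight requests.
    results = await asyncio.gather(
        model.embed(text="What is the capital of France?"),
        model.embed(text="What is the capital of Spain?"),
        model.embed(image="https://example.com/image.jpg"),
    )
    print(results)


asyncio.run(main())
```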

### OCR

Define which model we would like to use

```python
from orign import OCRModel

model = OCRModel(provider="easyocr")
```

Detect text in an image

```python
model.detect(image="https://example.com/image.jpg")
```

Use the async API

```python
from orign import AsyncOCRModel

model = AsyncOCRModel(provider="doctr")
await model.connect()

await model.detect(image="https://example.com/image.jpg")
```

## Examples

See the [examples](examples) directory for more usage examples.

            
