llama-index-embeddings-nebius


Name: llama-index-embeddings-nebius
Version: 0.3.1
Summary: llama-index embeddings Nebius AI Studio integration
Upload time: 2024-11-23 18:12:02
Home page: None
Maintainer: None
Docs URL: None
Author: Your Name
Requires Python: <4.0,>=3.9
License: MIT
# LlamaIndex Embeddings Integration: [Nebius AI Studio](https://studio.nebius.ai/)

## Overview

This package integrates LlamaIndex with the Nebius AI Studio API, which provides access to state-of-the-art open-source text embedding models.

## Installation

```bash
pip install llama-index-embeddings-nebius
```

## Usage

### Initialization

#### With environment variables

```.env
NEBIUS_API_KEY=your_api_key
```
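
If the key lives in a `.env` file, it has to be loaded into the process environment before the model is constructed. A minimal sketch using the third-party `python-dotenv` package (an assumption; it is not a dependency of this integration):

```python
import os

from dotenv import load_dotenv  # third-party: pip install python-dotenv

load_dotenv()  # reads .env and exports NEBIUS_API_KEY into os.environ
assert "NEBIUS_API_KEY" in os.environ
```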

```python
from llama_index.embeddings.nebius import NebiusEmbedding

embed_model = NebiusEmbedding(model_name="BAAI/bge-en-icl")
```

#### Without environment variables

```python
from llama_index.embeddings.nebius import NebiusEmbedding

embed_model = NebiusEmbedding(
    api_key="your_api_key", model_name="BAAI/bge-en-icl"
)
```

### Generating embeddings

#### Basic usage

```python
text = "Everyone loves justice at another person's expense"
embeddings = embed_model.get_text_embedding(text)
print(embeddings[:5])
```
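
Queries can be embedded with the same model through the standard LlamaIndex embedding interface; a brief sketch using `get_query_embedding`, which is inherited from the base embedding class:

```python
query = "What does everyone love?"
query_embedding = embed_model.get_query_embedding(query)
print(len(query_embedding), query_embedding[:5])
```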

#### Asynchronous usage

```python
text = "Everyone loves justice at another person's expense"
embeddings = await embed_model.aget_text_embedding(text)
print(embeddings[:5])
```
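
Top-level `await` works in notebooks; in a plain Python script the coroutine needs an event loop, for example via `asyncio.run`. A minimal sketch reusing the `embed_model` defined above:

```python
import asyncio


async def main() -> None:
    text = "Everyone loves justice at another person's expense"
    embeddings = await embed_model.aget_text_embedding(text)
    print(embeddings[:5])


asyncio.run(main())
```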

#### Batched usage

```python
texts = [
    "As the hours pass",
    "I will let you know",
    "That I need to ask",
    "Before I'm alone",
]

embeddings = embed_model.get_text_embedding_batch(texts)
print(*[x[:3] for x in embeddings], sep="\n")
```
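
Batched calls are split into requests internally by the base embedding class; the request batch size can typically be tuned via the `embed_batch_size` constructor argument (a base-class parameter, shown here as an assumption rather than a documented option of this integration):

```python
embed_model = NebiusEmbedding(
    model_name="BAAI/bge-en-icl",
    embed_batch_size=32,  # assumed base-class parameter controlling request batching
)
```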

#### Batched asynchronous usage

```python
texts = [
    "As the hours pass",
    "I will let you know",
    "That I need to ask",
    "Before I'm alone",
]

embeddings = await embed_model.aget_text_embedding_batch(texts)
print(*[x[:3] for x in embeddings], sep="\n")
```
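
The returned embeddings are plain lists of floats, so they can be compared directly. An illustrative sketch (not part of this package) that scores the remaining lines against the first one with cosine similarity:

```python
import math


def cosine_similarity(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)


# Compare "As the hours pass" against the other three lines
scores = [cosine_similarity(embeddings[0], e) for e in embeddings[1:]]
print(scores)
```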

            
