llama-index-llms-anyscale


Name: llama-index-llms-anyscale
Version: 0.3.0
Summary: llama-index llms anyscale integration
Upload time: 2024-11-18 01:28:25
Home page: none
Maintainer: none
Docs URL: none
Author: Your Name
Requires Python: <4.0,>=3.9
License: MIT
Keywords: none
Requirements: none recorded
# LlamaIndex Llms Integration: Anyscale

### Installation

```bash
pip install llama-index-llms-anyscale
pip install llama-index
```

### Basic Usage

```py
from llama_index.llms.anyscale import Anyscale
from llama_index.core.llms import ChatMessage

# Call chat with a list of ChatMessage objects.
# You must either set the ANYSCALE_API_KEY environment variable or pass api_key to the constructor.

# Example of setting API key through environment variable
# import os
# os.environ['ANYSCALE_API_KEY'] = '<your-api-key>'

# Initialize the Anyscale LLM with your API key
llm = Anyscale(api_key="<your-api-key>")

# Chat Example
message = ChatMessage(role="user", content="Tell me a joke")
resp = llm.chat([message])
print(resp)

# Expected Output:
# assistant: Sure, here's a joke for you:
#
# Why couldn't the bicycle stand up by itself?
#
# Because it was two-tired!
#
# I hope that brought a smile to your face! Is there anything else I can assist you with?
```
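LlamaIndex LLMs also expose async variants of these calls. A minimal sketch using `achat` (assuming the standard async interface applies to the Anyscale integration as well):

```py
import asyncio

from llama_index.core.llms import ChatMessage
from llama_index.llms.anyscale import Anyscale


async def main():
    # Same constructor as above; the key can also come from ANYSCALE_API_KEY
    llm = Anyscale(api_key="<your-api-key>")
    resp = await llm.achat([ChatMessage(role="user", content="Tell me a joke")])
    print(resp)


asyncio.run(main())
```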

### Streaming Example

```py
message = ChatMessage(role="user", content="Tell me a story in 250 words")
resp = llm.stream_chat([message])
for r in resp:
    print(r.delta, end="")

# Output Example:
# Once upon a time, there was a young girl named Maria who lived in a small village surrounded by lush green forests.
# Maria was a kind and gentle soul, loved by everyone in the village. She spent most of her days exploring the forests,
# discovering new species of plants and animals, and helping the villagers with their daily chores...
# (Story continues until it reaches the word limit.)
```
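The streaming call has an async counterpart as well. A short sketch using `astream_chat`, assumed to follow the standard LlamaIndex async streaming interface (awaiting the call returns an async generator of deltas):

```py
import asyncio

from llama_index.core.llms import ChatMessage


async def stream_story():
    # Reuses the `llm` instance created in the examples above
    gen = await llm.astream_chat(
        [ChatMessage(role="user", content="Tell me a story in 250 words")]
    )
    async for r in gen:
        print(r.delta, end="")


asyncio.run(stream_story())
```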

### Completion Example

```py
resp = llm.complete("Tell me a joke")
print(resp)

# Expected Output:
# assistant: Sure, here's a joke for you:
#
# Why couldn't the bicycle stand up by itself?
#
# Because it was two-tired!
```
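`complete` returns a response object rather than a bare string; printing it shows the text, and the generated string itself should also be available as an attribute (a small sketch assuming the standard `CompletionResponse.text` field):

```py
resp = llm.complete("Tell me a joke")
print(resp.text)  # the generated text as a plain Python string
```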

### Streaming Completion Example

```py
resp = llm.stream_complete("Tell me a story in 250 words")
for r in resp:
    print(r.delta, end="")

# Example Output:
# Once upon a time, there was a young girl named Maria who lived in a small village...
# (Stream continues as the story is generated.)
```

### Model Configuration

```py
llm = Anyscale(model="codellama/CodeLlama-34b-Instruct-hf")
resp = llm.complete("Show me the c++ code to send requests to HTTP Server")
print(resp)
```
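The constructor also accepts the usual sampling controls. The exact parameter set may differ between versions, so treat the argument names below as assumptions (OpenAI-style `temperature` and `max_tokens`):

```py
llm = Anyscale(
    model="codellama/CodeLlama-34b-Instruct-hf",
    api_key="<your-api-key>",
    temperature=0.1,  # assumed OpenAI-style sampling temperature
    max_tokens=512,  # assumed cap on tokens generated per response
)
resp = llm.complete("Show me the C++ code to send requests to an HTTP server")
print(resp)
```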

### LLM Implementation Example

https://docs.llamaindex.ai/en/stable/examples/llm/anyscale/
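Beyond direct calls, the instance can be used anywhere LlamaIndex expects an LLM, for example as the global default. A minimal sketch using the standard `Settings` object from `llama_index.core`:

```py
from llama_index.core import Settings
from llama_index.llms.anyscale import Anyscale

# Downstream components (query engines, chat engines, etc.) will pick this up
Settings.llm = Anyscale(api_key="<your-api-key>")
```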

            

Raw data

```json
{
    "_id": null,
    "home_page": null,
    "name": "llama-index-llms-anyscale",
    "maintainer": null,
    "docs_url": null,
    "requires_python": "<4.0,>=3.9",
    "maintainer_email": null,
    "keywords": null,
    "author": "Your Name",
    "author_email": "you@example.com",
    "download_url": "https://files.pythonhosted.org/packages/10/b0/0dd5f3d598eb82527dcdfb80f1f98591b2a637b8c40732d43bb96fe0e638/llama_index_llms_anyscale-0.3.0.tar.gz",
    "platform": null,
    "description": "# LlamaIndex Llms Integration: Anyscale\n\n### Installation\n\n```bash\n%pip install llama-index-llms-anyscale\n!pip install llama-index\n```\n\n### Basic Usage\n\n```py\nfrom llama_index.llms.anyscale import Anyscale\nfrom llama_index.core.llms import ChatMessage\n\n# Call chat with ChatMessage List\n# You need to either set env var ANYSCALE_API_KEY or set api_key in the class constructor\n\n# Example of setting API key through environment variable\n# import os\n# os.environ['ANYSCALE_API_KEY'] = '<your-api-key>'\n\n# Initialize the Anyscale LLM with your API key\nllm = Anyscale(api_key=\"<your-api-key>\")\n\n# Chat Example\nmessage = ChatMessage(role=\"user\", content=\"Tell me a joke\")\nresp = llm.chat([message])\nprint(resp)\n\n# Expected Output:\n# assistant: Sure, here's a joke for you:\n#\n# Why couldn't the bicycle stand up by itself?\n#\n# Because it was two-tired!\n#\n# I hope that brought a smile to your face! Is there anything else I can assist you with?\n```\n\n### Streaming Example\n\n```py\nmessage = ChatMessage(role=\"user\", content=\"Tell me a story in 250 words\")\nresp = llm.stream_chat([message])\nfor r in resp:\n    print(r.delta, end=\"\")\n\n# Output Example:\n# Once upon a time, there was a young girl named Maria who lived in a small village surrounded by lush green forests.\n# Maria was a kind and gentle soul, loved by everyone in the village. She spent most of her days exploring the forests,\n# discovering new species of plants and animals, and helping the villagers with their daily chores...\n# (Story continues until it reaches the word limit.)\n```\n\n### Completion Example\n\n```py\nresp = llm.complete(\"Tell me a joke\")\nprint(resp)\n\n# Expected Output:\n# assistant: Sure, here's a joke for you:\n#\n# Why couldn't the bicycle stand up by itself?\n#\n# Because it was two-tired!\n```\n\n### Streaming Completion Example\n\n```py\nresp = llm.stream_complete(\"Tell me a story in 250 words\")\nfor r in resp:\n    print(r.delta, end=\"\")\n\n# Example Output:\n# Once upon a time, there was a young girl named Maria who lived in a small village...\n# (Stream continues as the story is generated.)\n```\n\n### Model Configuration\n\n```py\nllm = Anyscale(model=\"codellama/CodeLlama-34b-Instruct-hf\")\nresp = llm.complete(\"Show me the c++ code to send requests to HTTP Server\")\nprint(resp)\n```\n\n### LLM Implementation example\n\nhttps://docs.llamaindex.ai/en/stable/examples/llm/anyscale/\n",
    "bugtrack_url": null,
    "license": "MIT",
    "summary": "llama-index llms anyscale integration",
    "version": "0.3.0",
    "project_urls": null,
    "split_keywords": [],
    "urls": [
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "36e1c72804bf5f7f12abc8b0eb553c2a9019a63107d66d589e026b5e5694e687",
                "md5": "c0b05e5d39a6a34b57a92f64fad9b4bd",
                "sha256": "90655064f3ed05efa7f053ee9ddeb2dff9d39d20ef2a250e46045acb60e7e020"
            },
            "downloads": -1,
            "filename": "llama_index_llms_anyscale-0.3.0-py3-none-any.whl",
            "has_sig": false,
            "md5_digest": "c0b05e5d39a6a34b57a92f64fad9b4bd",
            "packagetype": "bdist_wheel",
            "python_version": "py3",
            "requires_python": "<4.0,>=3.9",
            "size": 5098,
            "upload_time": "2024-11-18T01:28:24",
            "upload_time_iso_8601": "2024-11-18T01:28:24.338488Z",
            "url": "https://files.pythonhosted.org/packages/36/e1/c72804bf5f7f12abc8b0eb553c2a9019a63107d66d589e026b5e5694e687/llama_index_llms_anyscale-0.3.0-py3-none-any.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "10b00dd5f3d598eb82527dcdfb80f1f98591b2a637b8c40732d43bb96fe0e638",
                "md5": "77a604465e0cd0745d8b12aea3fea727",
                "sha256": "187fddd7ba54aa929d19976b77f64fc370b4cfe6654316474e5644899384b290"
            },
            "downloads": -1,
            "filename": "llama_index_llms_anyscale-0.3.0.tar.gz",
            "has_sig": false,
            "md5_digest": "77a604465e0cd0745d8b12aea3fea727",
            "packagetype": "sdist",
            "python_version": "source",
            "requires_python": "<4.0,>=3.9",
            "size": 4388,
            "upload_time": "2024-11-18T01:28:25",
            "upload_time_iso_8601": "2024-11-18T01:28:25.838844Z",
            "url": "https://files.pythonhosted.org/packages/10/b0/0dd5f3d598eb82527dcdfb80f1f98591b2a637b8c40732d43bb96fe0e638/llama_index_llms_anyscale-0.3.0.tar.gz",
            "yanked": false,
            "yanked_reason": null
        }
    ],
    "upload_time": "2024-11-18 01:28:25",
    "github": false,
    "gitlab": false,
    "bitbucket": false,
    "codeberg": false,
    "lcname": "llama-index-llms-anyscale"
}
```
        