| Field | Value |
| --- | --- |
| Name | llama-index-llms-anyscale |
| Version | 0.4.0 |
| Summary | llama-index llms anyscale integration |
| Upload time | 2025-07-30 21:35:09 |
| Home page | None |
| Maintainer | None |
| Docs URL | None |
| Author | None |
| Requires Python | <4.0,>=3.9 |
| License | None |
| Keywords | None |
| Requirements | No requirements were recorded. |
# LlamaIndex Llms Integration: Anyscale
### Installation
```bash
pip install llama-index-llms-anyscale
pip install llama-index
```
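
If you want to confirm the install before going further, a quick import check works (a minimal sketch; nothing Anyscale-specific is called yet):

```py
# Sanity check: both packages should import cleanly after installation.
import llama_index.core
import llama_index.llms.anyscale  # noqa: F401

print(llama_index.core.__version__)
```
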
### Basic Usage
```py
from llama_index.llms.anyscale import Anyscale
from llama_index.core.llms import ChatMessage
# Call chat with ChatMessage List
# You need to either set env var ANYSCALE_API_KEY or set api_key in the class constructor
# Example of setting API key through environment variable
# import os
# os.environ['ANYSCALE_API_KEY'] = '<your-api-key>'
# Initialize the Anyscale LLM with your API key
llm = Anyscale(api_key="<your-api-key>")
# Chat Example
message = ChatMessage(role="user", content="Tell me a joke")
resp = llm.chat([message])
print(resp)
# Expected Output:
# assistant: Sure, here's a joke for you:
#
# Why couldn't the bicycle stand up by itself?
#
# Because it was two-tired!
#
# I hope that brought a smile to your face! Is there anything else I can assist you with?
```
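
`chat` also accepts a multi-message conversation, including a system message. Here is a short sketch (the persona and follow-up prompt are illustrative):

```py
messages = [
    ChatMessage(role="system", content="You are a concise assistant."),
    ChatMessage(role="user", content="Tell me a joke"),
    ChatMessage(
        role="assistant",
        content="Why couldn't the bicycle stand up by itself? It was two-tired!",
    ),
    ChatMessage(role="user", content="Explain the pun in one sentence."),
]

# The whole conversation is passed as a single list of ChatMessage objects.
resp = llm.chat(messages)
print(resp)
```
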
### Streaming Example
```py
message = ChatMessage(role="user", content="Tell me a story in 250 words")
resp = llm.stream_chat([message])
for r in resp:
    print(r.delta, end="")

# Output Example:
# Once upon a time, there was a young girl named Maria who lived in a small village surrounded by lush green forests.
# Maria was a kind and gentle soul, loved by everyone in the village. She spent most of her days exploring the forests,
# discovering new species of plants and animals, and helping the villagers with their daily chores...
# (Story continues until it reaches the word limit.)
```
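
If you need the full text after streaming, you can accumulate the deltas yourself; a minimal sketch:

```py
chunks = []
for r in llm.stream_chat([message]):
    # Each chunk carries only the newly generated text in .delta.
    chunks.append(r.delta or "")
    print(r.delta, end="")

full_text = "".join(chunks)
```
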
### Completion Example
```py
resp = llm.complete("Tell me a joke")
print(resp)

# Expected Output:
# assistant: Sure, here's a joke for you:
#
# Why couldn't the bicycle stand up by itself?
#
# Because it was two-tired!
```
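
The value returned by `complete` is a response object rather than a plain string; printing it shows the text, and the raw string is also available on the object (a sketch assuming the standard LlamaIndex `CompletionResponse` interface):

```py
resp = llm.complete("Tell me a joke")

# The generated string itself lives on the .text attribute.
text = resp.text
print(len(text.split()), "words")
```
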
### Streaming Completion Example
```py
resp = llm.stream_complete("Tell me a story in 250 words")
for r in resp:
    print(r.delta, end="")

# Example Output:
# Once upon a time, there was a young girl named Maria who lived in a small village...
# (Stream continues as the story is generated.)
```
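
The base LlamaIndex LLM interface also defines async variants (`acomplete`, `achat`, `astream_chat`); assuming this integration supports them, a minimal sketch looks like:

```py
import asyncio


async def main():
    # Async variant of complete; mirrors the synchronous call above.
    resp = await llm.acomplete("Tell me a joke")
    print(resp)


asyncio.run(main())
```
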
### Model Configuration
```py
llm = Anyscale(model="codellama/CodeLlama-34b-Instruct-hf")
resp = llm.complete("Show me the c++ code to send requests to HTTP Server")
print(resp)
```
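
Sampling parameters can typically be set on the constructor as well. The keyword names below follow the OpenAI-style interface this integration mirrors and should be treated as assumptions rather than a documented signature:

```py
# temperature / max_tokens are assumed OpenAI-style keyword arguments.
llm = Anyscale(
    model="codellama/CodeLlama-34b-Instruct-hf",
    api_key="<your-api-key>",
    temperature=0.1,
    max_tokens=512,
)
resp = llm.complete("Show me the C++ code to send requests to an HTTP server")
print(resp)
```
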
### LLM Implementation example
https://docs.llamaindex.ai/en/stable/examples/llm/anyscale/
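
For the full notebook walkthrough, see the link above. As a quick orientation, the LLM can also be installed as the default model for the rest of LlamaIndex via the global `Settings` object (a minimal sketch; embeddings are configured separately):

```py
from llama_index.core import Settings
from llama_index.llms.anyscale import Anyscale

# Use Anyscale as the default LLM for query engines, chat engines, etc.
Settings.llm = Anyscale(api_key="<your-api-key>")

# Note: Settings.embed_model is configured separately; this only swaps
# the text-generation model, not the embedding model.
```
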
Raw data

```json
{
"_id": null,
"home_page": null,
"name": "llama-index-llms-anyscale",
"maintainer": null,
"docs_url": null,
"requires_python": "<4.0,>=3.9",
"maintainer_email": null,
"keywords": null,
"author": null,
"author_email": "Your Name <you@example.com>",
"download_url": "https://files.pythonhosted.org/packages/84/dd/5f21180ad36bb7193436f3bbc8e91549700635a6ef90080c99826e034bc2/llama_index_llms_anyscale-0.4.0.tar.gz",
"platform": null,
"description": "# LlamaIndex Llms Integration: Anyscale\n\n### Installation\n\n```bash\n%pip install llama-index-llms-anyscale\n!pip install llama-index\n```\n\n### Basic Usage\n\n```py\nfrom llama_index.llms.anyscale import Anyscale\nfrom llama_index.core.llms import ChatMessage\n\n# Call chat with ChatMessage List\n# You need to either set env var ANYSCALE_API_KEY or set api_key in the class constructor\n\n# Example of setting API key through environment variable\n# import os\n# os.environ['ANYSCALE_API_KEY'] = '<your-api-key>'\n\n# Initialize the Anyscale LLM with your API key\nllm = Anyscale(api_key=\"<your-api-key>\")\n\n# Chat Example\nmessage = ChatMessage(role=\"user\", content=\"Tell me a joke\")\nresp = llm.chat([message])\nprint(resp)\n\n# Expected Output:\n# assistant: Sure, here's a joke for you:\n#\n# Why couldn't the bicycle stand up by itself?\n#\n# Because it was two-tired!\n#\n# I hope that brought a smile to your face! Is there anything else I can assist you with?\n```\n\n### Streaming Example\n\n```py\nmessage = ChatMessage(role=\"user\", content=\"Tell me a story in 250 words\")\nresp = llm.stream_chat([message])\nfor r in resp:\n print(r.delta, end=\"\")\n\n# Output Example:\n# Once upon a time, there was a young girl named Maria who lived in a small village surrounded by lush green forests.\n# Maria was a kind and gentle soul, loved by everyone in the village. She spent most of her days exploring the forests,\n# discovering new species of plants and animals, and helping the villagers with their daily chores...\n# (Story continues until it reaches the word limit.)\n```\n\n### Completion Example\n\n```py\nresp = llm.complete(\"Tell me a joke\")\nprint(resp)\n\n# Expected Output:\n# assistant: Sure, here's a joke for you:\n#\n# Why couldn't the bicycle stand up by itself?\n#\n# Because it was two-tired!\n```\n\n### Streaming Completion Example\n\n```py\nresp = llm.stream_complete(\"Tell me a story in 250 words\")\nfor r in resp:\n print(r.delta, end=\"\")\n\n# Example Output:\n# Once upon a time, there was a young girl named Maria who lived in a small village...\n# (Stream continues as the story is generated.)\n```\n\n### Model Configuration\n\n```py\nllm = Anyscale(model=\"codellama/CodeLlama-34b-Instruct-hf\")\nresp = llm.complete(\"Show me the c++ code to send requests to HTTP Server\")\nprint(resp)\n```\n\n### LLM Implementation example\n\nhttps://docs.llamaindex.ai/en/stable/examples/llm/anyscale/\n",
"bugtrack_url": null,
"license": null,
"summary": "llama-index llms anyscale integration",
"version": "0.4.0",
"project_urls": null,
"split_keywords": [],
"urls": [
{
"comment_text": null,
"digests": {
"blake2b_256": "c54eac554396b4b25d7226961ae03c50b0999c8302ee775c238e7e27fc874f05",
"md5": "5f3c4bb39754e4eaa0fd1bcf27203a3d",
"sha256": "37e249b439c86d11e1dba21ee9a0685bf5c57435429899b229564ee5bfb1cfff"
},
"downloads": -1,
"filename": "llama_index_llms_anyscale-0.4.0-py3-none-any.whl",
"has_sig": false,
"md5_digest": "5f3c4bb39754e4eaa0fd1bcf27203a3d",
"packagetype": "bdist_wheel",
"python_version": "py3",
"requires_python": "<4.0,>=3.9",
"size": 5916,
"upload_time": "2025-07-30T21:35:08",
"upload_time_iso_8601": "2025-07-30T21:35:08.984431Z",
"url": "https://files.pythonhosted.org/packages/c5/4e/ac554396b4b25d7226961ae03c50b0999c8302ee775c238e7e27fc874f05/llama_index_llms_anyscale-0.4.0-py3-none-any.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": null,
"digests": {
"blake2b_256": "84dd5f21180ad36bb7193436f3bbc8e91549700635a6ef90080c99826e034bc2",
"md5": "8cbfc8d9404f7ff239ce947e03a61a7d",
"sha256": "e59927075f7628117971668b286ada3a453aba4a604f90aaad533275903b7a2e"
},
"downloads": -1,
"filename": "llama_index_llms_anyscale-0.4.0.tar.gz",
"has_sig": false,
"md5_digest": "8cbfc8d9404f7ff239ce947e03a61a7d",
"packagetype": "sdist",
"python_version": "source",
"requires_python": "<4.0,>=3.9",
"size": 5809,
"upload_time": "2025-07-30T21:35:09",
"upload_time_iso_8601": "2025-07-30T21:35:09.894326Z",
"url": "https://files.pythonhosted.org/packages/84/dd/5f21180ad36bb7193436f3bbc8e91549700635a6ef90080c99826e034bc2/llama_index_llms_anyscale-0.4.0.tar.gz",
"yanked": false,
"yanked_reason": null
}
],
"upload_time": "2025-07-30 21:35:09",
"github": false,
"gitlab": false,
"bitbucket": false,
"codeberg": false,
"lcname": "llama-index-llms-anyscale"
}
```