| Field | Value |
| --- | --- |
| Name | llama-index-llms-dashscope |
| Version | 0.3.0 |
| Summary | llama-index llms dashscope integration |
| Upload time | 2024-11-18 00:22:06 |
| Author | Your Name |
| Requires Python | <4.0,>=3.9 |
| License | MIT |

# LlamaIndex Llms Integration: Dashscope
## Installation
1. Install the required Python package:

   ```bash
   pip install llama-index-llms-dashscope
   ```

2. Set the DashScope API key as an environment variable:

   ```bash
   export DASHSCOPE_API_KEY=YOUR_DASHSCOPE_API_KEY
   ```

   Alternatively, you can set it in your Python script:

   ```python
   import os

   os.environ["DASHSCOPE_API_KEY"] = "YOUR_DASHSCOPE_API_KEY"
   ```
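If the key is missing, DashScope requests fail at call time with an authentication error, so it can help to fail fast at startup. A minimal sketch (the helper name `require_api_key` is ours, not part of the package):

```python
import os


def require_api_key(var: str = "DASHSCOPE_API_KEY") -> str:
    """Return the API key from the environment, or raise a clear error."""
    key = os.environ.get(var)
    if not key:
        raise RuntimeError(
            f"{var} is not set; export it or assign os.environ['{var}'] first."
        )
    return key
```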
## Usage
### Basic Recipe Generation
To generate a completion, for example a basic cake recipe:
```python
from llama_index.llms.dashscope import DashScope, DashScopeGenerationModels
# Initialize DashScope object
dashscope_llm = DashScope(model_name=DashScopeGenerationModels.QWEN_MAX)
# Generate a cake recipe
resp = dashscope_llm.complete("How to make cake?")
print(resp)
```
### Streaming Recipe Responses
For real-time streamed responses:
```python
responses = dashscope_llm.stream_complete("How to make cake?")
for response in responses:
    print(response.delta, end="")
```
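Each streamed chunk carries only the newly generated text in `.delta`, so keeping the full response means accumulating the deltas as they arrive. A sketch of that pattern, with a hypothetical stand-in generator in place of the real `stream_complete` call:

```python
from types import SimpleNamespace


def collect_stream(responses) -> str:
    """Accumulate the .delta of each streamed chunk into the full text."""
    parts = []
    for response in responses:
        parts.append(response.delta)
    return "".join(parts)


# Stand-in for dashscope_llm.stream_complete(...): yields chunks with .delta
def fake_stream(text: str, size: int = 4):
    for i in range(0, len(text), size):
        yield SimpleNamespace(delta=text[i : i + size])
```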
### Multi-Round Conversation
To have a conversation with the assistant and ask for a sugar-free cake recipe:
```python
from llama_index.core.base.llms.types import MessageRole, ChatMessage
messages = [
    ChatMessage(
        role=MessageRole.SYSTEM, content="You are a helpful assistant."
    ),
    ChatMessage(role=MessageRole.USER, content="How to make cake?"),
]
# Get first round response
resp = dashscope_llm.chat(messages)
print(resp)
# Continue conversation
messages.append(
    ChatMessage(role=MessageRole.ASSISTANT, content=resp.message.content)
)
messages.append(
    ChatMessage(role=MessageRole.USER, content="How to make it without sugar?")
)
# Get second round response
resp = dashscope_llm.chat(messages)
print(resp)
```
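The pattern above — append the assistant's reply, then the next user turn, then call `chat` again — can be wrapped in a small helper. A sketch using plain role/content dicts in place of `ChatMessage` objects (the helper name `extend_history` is ours; swap in `ChatMessage(role=..., content=...)` for real use):

```python
def extend_history(messages, assistant_reply: str, user_followup: str):
    """Append the assistant's last reply and the next user turn to the history."""
    messages.append({"role": "assistant", "content": assistant_reply})
    messages.append({"role": "user", "content": user_followup})
    return messages


history = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "How to make cake?"},
]
extend_history(history, "Mix flour, eggs, and butter...", "How to make it without sugar?")
```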
### Handling Sugar-Free Recipes
To request a sugar-free cake recipe in a single completion:
```python
resp = dashscope_llm.complete("How to make cake without sugar?")
print(resp)
```
### LLM Implementation Example
https://docs.llamaindex.ai/en/stable/examples/llm/dashscope/