llama-index-llms-dashscope

Name: llama-index-llms-dashscope
Version: 0.2.2
Summary: llama-index llms dashscope integration
Upload time: 2024-10-08 22:26:05
Author: Your Name
Requires Python: <4.0,>=3.8.1
License: MIT
# LlamaIndex Llms Integration: Dashscope

## Installation

1. Install the required Python package:

   ```bash
   pip install llama-index-llms-dashscope
   ```

2. Set the DashScope API key as an environment variable:

   ```bash
   export DASHSCOPE_API_KEY=YOUR_DASHSCOPE_API_KEY
   ```

   Alternatively, you can set it in your Python script:

   ```python
   import os

   os.environ["DASHSCOPE_API_KEY"] = "YOUR_DASHSCOPE_API_KEY"
   ```
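Since a missing key typically only surfaces as an error on the first API call, it can help to check for it up front. The sketch below is a hypothetical helper (`require_dashscope_key` is not part of the integration) that fails fast with a clear message:

```python
import os


def require_dashscope_key() -> str:
    """Return the DashScope API key, or raise a clear error if it is unset."""
    key = os.environ.get("DASHSCOPE_API_KEY")
    if not key:
        raise EnvironmentError(
            "DASHSCOPE_API_KEY is not set; export it or set it in the script."
        )
    return key
```

Calling this once at startup keeps configuration errors close to their cause rather than deep inside a request stack trace.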

## Usage

### Basic Recipe Generation

To generate a basic cake recipe with a single completion call:

```python
from llama_index.llms.dashscope import DashScope, DashScopeGenerationModels

# Initialize DashScope object
dashscope_llm = DashScope(model_name=DashScopeGenerationModels.QWEN_MAX)

# Generate a cake recipe
resp = dashscope_llm.complete("How to make cake?")
print(resp)
```

### Streaming Recipe Responses

For real-time streamed responses:

```python
responses = dashscope_llm.stream_complete("How to make cake?")
for response in responses:
    print(response.delta, end="")
```
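The loop above concatenates the `delta` of each streamed chunk to rebuild the full text. The following self-contained sketch illustrates that accumulation pattern with a stand-in `Chunk` class (hypothetical; it only assumes each streamed object exposes a `delta` attribute, as the loop does):

```python
from dataclasses import dataclass
from typing import Iterable


@dataclass
class Chunk:
    """Stand-in for a streamed response object with a .delta attribute."""

    delta: str


def accumulate(chunks: Iterable[Chunk]) -> str:
    # Joining the deltas in order reconstructs the full completion text.
    return "".join(chunk.delta for chunk in chunks)
```

In real use you would print each `delta` as it arrives for responsiveness, and optionally keep the accumulated string for later processing.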

### Multi-Round Conversation

To have a conversation with the assistant and ask for a sugar-free cake recipe:

```python
from llama_index.core.base.llms.types import MessageRole, ChatMessage

messages = [
    ChatMessage(
        role=MessageRole.SYSTEM, content="You are a helpful assistant."
    ),
    ChatMessage(role=MessageRole.USER, content="How to make cake?"),
]

# Get first round response
resp = dashscope_llm.chat(messages)
print(resp)

# Continue conversation
messages.append(
    ChatMessage(role=MessageRole.ASSISTANT, content=resp.message.content)
)
messages.append(
    ChatMessage(role=MessageRole.USER, content="How to make it without sugar?")
)

# Get second round response
resp = dashscope_llm.chat(messages)
print(resp)
```
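The key idea in the snippet above is that the assistant's reply must be appended to the message list before the follow-up question, so the model sees the earlier answer. A minimal sketch of that turn-tracking pattern, using plain dicts in place of `ChatMessage` so it runs without llama-index installed (`add_turn` is a hypothetical helper, not part of the DashScope API):

```python
def add_turn(history, user_text, assistant_text):
    """Record one question/answer round in the running message list."""
    history.append({"role": "user", "content": user_text})
    history.append({"role": "assistant", "content": assistant_text})
    return history


history = [{"role": "system", "content": "You are a helpful assistant."}]
add_turn(history, "How to make cake?", "Mix flour, sugar, eggs, and butter...")
# The follow-up question is appended on top of the full history, so the
# model can resolve "it" to the cake discussed in the previous round.
history.append({"role": "user", "content": "How to make it without sugar?"})
```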

### Handling Sugar-Free Recipes

To request a sugar-free cake recipe in a single completion call:

```python
resp = dashscope_llm.complete("How to make cake without sugar?")
print(resp)
```

### LLM Implementation Example

https://docs.llamaindex.ai/en/stable/examples/llm/dashscope/

            
