| Name | llama-index-llms-zhipuai |
| Version | 0.1.1 |
| Summary | llama-index llms zhipuai integration |
| Author | nightosong |
| Maintainer | None |
| Home page | None |
| Docs URL | None |
| License | MIT |
| Requires Python | <4.0,>=3.8.1 |
| Upload time | 2024-10-15 12:30:14 |
| Keywords | None |
| VCS | None |
| Bugtrack URL | None |
| Requirements | No requirements were recorded. |
| Travis-CI | No Travis. |
| Coveralls test coverage | No coveralls. |

# LlamaIndex Llms Integration: ZhipuAI
### Installation
```bash
%pip install llama-index-llms-zhipuai
!pip install llama-index
```
### Basic usage
```py
# Import ZhipuAI
from llama_index.llms.zhipuai import ZhipuAI

# Set your API key
api_key = "Your API KEY"

# Call the complete endpoint
response = ZhipuAI(model="glm-4", api_key=api_key).complete("who are you")
print(response)

# Output
# I am an AI assistant named ZhiPuQingYan(智谱清言), you can call me Xiaozhi🤖, which is developed based on the language model jointly trained by Tsinghua University KEG Lab and Zhipu AI Company in 2023. My job is to provide appropriate answers and support to users' questions and requests.

# Call chat with a list of messages
from llama_index.core.llms import ChatMessage

messages = [
    ChatMessage(role="user", content="who are you"),
]

response = ZhipuAI(model="glm-4", api_key=api_key).chat(messages)
print(response)

# Output
# assistant: I am an AI assistant named ZhiPuQingYan(智谱清言), you can call me Xiaozhi🤖, which is developed based on the language model jointly trained by Tsinghua University KEG Lab and Zhipu AI Company in 2023. My job is to provide appropriate answers and support to users' questions and requests.
```
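Hard-coding the key is fine for a quick test, but a common alternative is to read it from an environment variable so it never lands in source control. `ZHIPUAI_API_KEY` below is an assumed variable name, not one mandated by the SDK:

```python
import os

# Assumed variable name; fall back to the placeholder so the snippet runs anywhere.
# Export the real key in your shell, e.g.: export ZHIPUAI_API_KEY="..."
api_key = os.environ.get("ZHIPUAI_API_KEY", "Your API KEY")
```

The resulting `api_key` string can be passed to `ZhipuAI(...)` exactly as in the example above.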
### Streaming: Using stream endpoint
```py
from llama_index.llms.zhipuai import ZhipuAI
from llama_index.core.llms import ChatMessage

llm = ZhipuAI(model="glm-4", api_key=api_key)

# Using the stream_complete endpoint
response = llm.stream_complete("who are you")
for r in response:
    print(r.delta, end="")

# Using the stream_chat endpoint
messages = [
    ChatMessage(role="user", content="who are you"),
]

response = llm.stream_chat(messages)
for r in response:
    print(r.delta, end="")
```
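The stream endpoints yield chunk objects whose `.delta` field carries only the newest text fragment, so joining the deltas reconstructs the full response. A minimal local sketch of that accumulation pattern, using a stand-in generator instead of a live API call:

```python
from dataclasses import dataclass


@dataclass
class Chunk:
    delta: str  # newest text fragment, mirroring the streamed response objects


def fake_stream():
    # Stand-in for llm.stream_complete(...): yields fragments one at a time
    for piece in ["I am ", "an AI ", "assistant."]:
        yield Chunk(delta=piece)


# Same loop shape as the real endpoints, but collecting instead of printing
full_text = "".join(chunk.delta for chunk in fake_stream())
print(full_text)  # I am an AI assistant.
```

The same `"".join(...)` pattern works against the real `stream_complete`/`stream_chat` iterators when you need the complete text after streaming finishes.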
### Function Calling
```py
from llama_index.llms.zhipuai import ZhipuAI

llm = ZhipuAI(model="glm-4", api_key="YOUR API KEY")
tools = [
    {
        "type": "function",
        "function": {
            "name": "query_weather",
            "description": "Query the weather of the city provided by user",
            "parameters": {
                "type": "object",
                "properties": {
                    "city": {
                        "type": "string",
                        "description": "City to query",
                    },
                },
                "required": ["city"],
            },
        },
    }
]
response = llm.complete(
    "help me to find the weather in Shanghai",
    tools=tools,
    tool_choice="auto",
)
print(llm.get_tool_calls_from_response(response))

# Output
# [ToolSelection(tool_id='call_9097928240216277928', tool_name='query_weather', tool_kwargs={'city': 'Shanghai'})]
```
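`get_tool_calls_from_response` only extracts the model's tool selections; executing them is up to the caller. A hedged sketch of one way to dispatch a selection to a local handler — the registry and the `query_weather` implementation below are illustrative, not part of the package:

```python
# Illustrative local handler matching the `query_weather` schema declared above
def query_weather(city: str) -> str:
    return f"Weather lookup for {city} goes here"


# Map tool names (as declared in `tools`) to Python callables
TOOL_REGISTRY = {"query_weather": query_weather}


def dispatch(tool_name: str, tool_kwargs: dict) -> str:
    # Look up the handler by the name the model selected, then call it
    # with the keyword arguments the model filled in
    handler = TOOL_REGISTRY[tool_name]
    return handler(**tool_kwargs)


# Using the fields of the ToolSelection shown in the output above:
result = dispatch("query_weather", {"city": "Shanghai"})
print(result)  # Weather lookup for Shanghai goes here
```

In a real loop you would iterate over the list returned by `get_tool_calls_from_response`, dispatch each selection, and feed the results back to the model as follow-up messages.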
### ZhipuAI Documentation

- Usage: https://bigmodel.cn/dev/howuse/introduction
- API: https://bigmodel.cn/dev/api/normal-model/glm-4