# ChatAgent
[中文说明](./README_CN.md)
A Python-based large language model agent framework.
Agents deployed online through ChatAgent have served over a million stable API calls for the internal OpenRL team.
## Features
- [x] Supports multimodal large language models
- [x] Supports OpenAI API
- [x] Supports API calls to Qwen on Alibaba Cloud, Zhipu AI's GLM, Microsoft Azure, etc.
- [x] Supports parallel and sequential calls of different agents
- [x] Supports API-key-based access control
- [x] Supports setting a per-model concurrency limit, i.e., the maximum number of requests a model handles at the same time
- [x] Supports customizing complex agent interaction strategies
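The concurrency limit above can be pictured as a semaphore around the model call; this is a minimal sketch of the idea, not ChatAgent's actual internals (`ThrottledModel` is a hypothetical name):

```python
import asyncio

class ThrottledModel:
    """Bound how many requests a model handles at once with a semaphore."""

    def __init__(self, max_concurrency: int):
        self._sem = asyncio.Semaphore(max_concurrency)
        self.in_flight = 0
        self.peak = 0  # track the highest observed concurrency

    async def chat(self, prompt: str) -> str:
        async with self._sem:  # extra requests wait here for a free slot
            self.in_flight += 1
            self.peak = max(self.peak, self.in_flight)
            await asyncio.sleep(0.01)  # stand-in for the real model call
            self.in_flight -= 1
            return f"echo: {prompt}"

async def main():
    model = ThrottledModel(max_concurrency=2)
    replies = await asyncio.gather(*(model.chat(f"q{i}") for i in range(6)))
    print(replies[0], "| peak concurrency:", model.peak)

asyncio.run(main())
```

Six requests arrive at once, but the semaphore ensures at most two are in flight at any moment.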
## Installation
```bash
pip install ChatAgent-python
```
## Usage
We provide examples in the `examples` directory that you can run directly to explore ChatAgent's capabilities.
### 1. Example for Qwen/ZhiPu API to OpenAI API
With just over a dozen lines of code, you can expose the Qwen/Zhipu API as an OpenAI-compatible API.
For specific code and test cases, please refer to [examples/qwen2openai](./examples/qwen2openai) and [examples/glm2openai](./examples/glm2openai).
```python
import os

from ChatAgent import serve
from ChatAgent.chat_models.base_chat_model import BaseChatModel
from ChatAgent.agents.dashscope_chat_agent import DashScopeChatAgent
from ChatAgent.protocol.openai_api_protocol import MultimodalityChatCompletionRequest


class QwenMax(BaseChatModel):
    def init_agent(self):
        self.agent = DashScopeChatAgent(model_name="qwen-max", api_key=os.getenv("QWEN_API_KEY"))

    def create_chat_completion(self, request):
        return self.agent.act(request)


@serve.create_chat_completion()
async def implement_completions(request: MultimodalityChatCompletionRequest):
    return QwenMax().create_chat_completion(request)


serve.run(host="0.0.0.0", port=6367)
```
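Once the server is running, it can be queried like any OpenAI-compatible endpoint. Below is a sketch of the request payload; the `/v1/chat/completions` route follows the OpenAI convention, and whether ChatAgent mounts exactly that path is an assumption here:

```python
import json

# Build an OpenAI-style chat-completion payload for the server above.
# "qwen-max" and port 6367 come from the example; the route is assumed.
base_url = "http://localhost:6367/v1"
payload = {
    "model": "qwen-max",
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello!"},
    ],
}
body = json.dumps(payload)
print(body)

# Sending it (requires the server from the example to be running):
#   curl http://localhost:6367/v1/chat/completions \
#     -H "Content-Type: application/json" -d @payload.json
```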
### 2. Ensemble with Multiple Agents
We provide an example in [examples/multiagent_ensemble](./examples/multiagent_ensemble) in which multiple agents answer user questions as an ensemble.
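The ensemble pattern can be sketched as a parallel fan-out followed by a majority vote. This is an illustrative stand-in, not the code in `examples/multiagent_ensemble`; the agents here return canned answers:

```python
import asyncio
from collections import Counter

# Canned replies standing in for real model calls (hypothetical agents).
CANNED = {"agent_a": "Paris", "agent_b": "Paris", "agent_c": "Lyon"}

async def ask(agent_name: str, question: str) -> str:
    await asyncio.sleep(0)  # simulate async I/O to a model API
    return CANNED[agent_name]

async def ensemble(question: str, agents: list) -> str:
    # Fan the question out to all agents in parallel...
    replies = await asyncio.gather(*(ask(a, question) for a in agents))
    # ...then majority-vote over their answers.
    return Counter(replies).most_common(1)[0][0]

result = asyncio.run(ensemble("Capital of France?", ["agent_a", "agent_b", "agent_c"]))
print(result)  # Paris
```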
### 3. Agent Q&A Based on RAG Query Results
We provide an example in [examples/rag](./examples/rag) of agent Q&A based on RAG query results.
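The RAG pattern boils down to two steps: retrieve the documents most relevant to the question, then build a prompt that grounds the model's answer in them. A toy sketch (keyword-overlap retrieval, not the code in `examples/rag`):

```python
import re

DOCS = [
    "ChatAgent is a Python-based LLM agent framework.",
    "The framework supports Qwen, GLM, and Azure APIs.",
    "Bananas are rich in potassium.",
]

def _tokens(text: str) -> set:
    return set(re.findall(r"[a-z0-9]+", text.lower()))

def retrieve(question: str, docs, k: int = 2):
    """Rank documents by keyword overlap with the question; keep the top k."""
    q = _tokens(question)
    return sorted(docs, key=lambda d: -len(q & _tokens(d)))[:k]

def build_prompt(question: str, docs) -> str:
    """Assemble a prompt that asks the model to answer from context only."""
    context = "\n".join(f"- {d}" for d in retrieve(question, docs))
    return f"Answer using only this context:\n{context}\n\nQuestion: {question}"

prompt = build_prompt("What APIs does the framework support?", DOCS)
print(prompt)
```

A real pipeline would replace the keyword overlap with embedding similarity and send the prompt to an agent.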
## Citation
If you use ChatAgent, please cite us:
```bibtex
@misc{ChatAgent2024,
  title        = {ChatAgent},
  author       = {Shiyu Huang},
  publisher    = {GitHub},
  howpublished = {\url{https://github.com/OpenRL-Lab/ChatAgent}},
  year         = {2024},
}
```