<div align="center">
<img src="docs/_static/image/aoe-logo.svg" width="250"/>
[![PyPI](https://img.shields.io/pypi/v/Open-AOE)](https://pypi.org/project/Open-AOE)
[![Docker Image Version (tag latest semver)](https://img.shields.io/docker/v/opensealion/open-aoe?label=docker)](https://hub.docker.com/r/opensealion/open-aoe?label=docker)
English | [简体中文](docs/README_zh-CN.md)
</div>
## Latest Progress 🎉
- \[January 2024\] Released version v0.0.1, officially open-sourced!
______________________________________________________________________
# Introduction
## What is Open-AOE?
AOE, an acronym from DOTA2 for Area of Effect, denotes an ability that can affect a group of targets within a certain area.
Here, AOE in AI means that users can obtain parallel outputs from multiple LLMs with a single prompt.
![](docs/_static/gif/aoe-en.gif)
## What problem does Open-AOE want to solve?
There are already many open-source chat frameworks built around ChatGPT, but an open-source framework that returns multiple LLMs' responses to a single prompt has been missing.
Open-AOE fills this gap: it helps LLM researchers, evaluators, engineering developers, and even non-professionals quickly access the market's well-known commercial and open-source large models, offering both single-model serial response and multi-model parallel response modes.
## What can you get from Open-AOE?
1. Send a prompt to one or more large language models at once and receive their responses.
2. Access commercial large model APIs, with default support for GPT-3.5, GPT-4, Google PaLM, Minimax, Claude, Spark, etc.; you can also wire in other large model APIs yourself.
3. Access open-source large model APIs; you can use [LMDeploy](https://github.com/InternLM/lmdeploy) to deploy open-source large models with one click.
4. Both backend APIs and a web UI are provided to meet the needs of different users.
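The core idea above — one prompt fanned out to several models in parallel — can be sketched as follows. This is an illustrative TypeScript sketch only: the request shape and model names are assumptions, not Open-AOE's actual API.

```typescript
// Hypothetical sketch of the single-prompt, multi-model fan-out.
// The payload shape and model ids here are assumptions, not Open-AOE's real API.
interface ChatRequest {
  model: string;
  prompt: string;
}

// Build one request per target model from a single prompt.
function buildRequests(prompt: string, models: string[]): ChatRequest[] {
  return models.map((model) => ({ model, prompt }));
}

// A real client would then send these concurrently, e.g.:
//   await Promise.all(requests.map((r) => fetch(endpoint, { method: "POST", body: JSON.stringify(r) })))
const requests = buildRequests("Explain AOE in one sentence.", ["gpt-3.5", "claude", "palm"]);
console.log(requests.length); // 3
```

The parallel mode is just `Promise.all` over these per-model requests; the serial mode would await them one by one instead.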
# Quick Run
There are three ways to run Open-AOE: by pip, by docker, or from source code.
## Run by pip
> [!TIP]
> Requires Python >= 3.9
### **Install**
```shell
pip install -U open-aoe
```
### **Start**
```shell
open-aoe -f /path/to/your/config-template.yaml
```
## Run by docker
### **Install**
There are two ways to get the Open-AOE docker image:
1. pull the open-aoe docker image
```shell
docker pull opensealion/open-aoe:latest
docker tag opensealion/open-aoe:latest open-aoe:latest
```
2. or build a docker image
```shell
git clone https://github.com/internlm/Open-AOE
cd open-aoe
docker build . -f docker/Dockerfile -t open-aoe:latest
```
### **Start**
```shell
docker run -p 10099:10099 -v /path/to/your/config-template.yaml:/app/config-template.yaml --name Open-AOE open-aoe:latest
```
## Run by source code
### **Install**
1. clone this project
```shell
git clone https://github.com/internlm/Open-AOE
```
2. [_optional_] build the frontend project if you have changed the frontend code
```shell
cd open-aoe/openaoe/frontend
npm install
npm run build
```
### **Start**
```shell
cd open-aoe/openaoe
pip install -r backend/requirements.txt
python -m main -f /path/to/your/config-template.yaml
```
> [!TIP]
> `/path/to/your/config-template.yaml` is the configuration file Open-AOE reads at startup.
> It contains the configuration for each large model,
> including API URLs, access keys/secrets (AK/SK), tokens, etc., which are required for Open-AOE to start.
> A template file can be found in `openaoe/backend/config/config.yaml`.
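For orientation only, a config file might look roughly like the sketch below. The key names here are hypothetical; always start from the real template in `openaoe/backend/config/config.yaml`.

```yaml
# Hypothetical sketch only -- see openaoe/backend/config/config.yaml for the real schema.
models:
  openai:
    api_base: https://api.openai.com/v1   # API URL for the provider
    api_key: sk-...                       # token / AK required at startup
  claude:
    api_base: https://api.anthropic.com
    api_key: your-ak-sk-here
```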
# Tech Introduction
> **You are always welcome to fork this project and contribute your work**
If you want to add more LLMs' APIs or features based on Open-AOE, the following info might be helpful.
## Tech Stack
The technology stack we use includes:
1. Backend: Python + FastAPI;
2. Frontend: TypeScript + Sealion-Client (an encapsulation of React) + Sealion-UI;
3. Build tools:
   1. conda: quickly create a virtual Python env and install the necessary packages
   2. npm: build the frontend project
> [!TIP]
> The build tools can be installed via [sealion-cli](https://github.com/opensealion/sealion-cli): `pip install -U sealion-cli`
## Organization of the Repo
- Frontend code in `openaoe/frontend`
- Backend code in `openaoe/backend`
- Project entry-point is `openaoe/main.py`
## How to add a new model
### Frontend
- Add new model info such as `name`, `avatar`, `provider`, etc. in `openaoe/frontend/src/config/model-config.ts`
- Add the model's base API request payload configuration in `openaoe/frontend/src/config/api-config.ts`
- Adapt the model's payload in `openaoe/frontend/src/services/fetch.ts`; you may need to change the payload structure and handle corner cases according to your model's API definition.
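To make the first step concrete, here is a hedged TypeScript sketch of a model entry. The field names `name`, `avatar`, and `provider` come from the list above, but the overall object shape is an assumption, not the actual `model-config.ts` schema:

```typescript
// Hypothetical sketch of a model-config entry; the real schema lives in
// openaoe/frontend/src/config/model-config.ts and may differ.
interface ModelInfo {
  name: string;     // display name shown in the chat UI
  avatar: string;   // path to the model's avatar image
  provider: string; // which backend provider handles this model
}

const exampleModels: Record<string, ModelInfo> = {
  "my-model": {
    name: "My Model",
    avatar: "/assets/my-model.png",        // hypothetical asset path
    provider: "hypothetical-provider",
  },
};

// List the registered model ids.
function getModelIds(models: Record<string, ModelInfo>): string[] {
  return Object.keys(models);
}

console.log(getModelIds(exampleModels)); // ["my-model"]
```

After registering the entry, the matching payload template in `api-config.ts` and the request handling in `fetch.ts` complete the wiring.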