openaoe

Name: openaoe
Version: 0.0.6
Home page: https://github.com/internlm/openaoe
Summary: LLM Group Chat Framework: chat with multiple LLMs at the same time
Upload time: 2024-03-05 13:39:50
Author: arkmon
License: Apache License 2.0
Keywords: openaoe, open-aoe
Requirements: none recorded
<div align="center">
  <img src="docs/_static/image/aoe-logo.svg" width="250"/>

[![PyPI](https://img.shields.io/pypi/v/OpenAOE)](https://pypi.org/project/OpenAOE)
[![Docker Image Version (tag latest semver)](https://img.shields.io/docker/v/opensealion/openaoe?label=docker)](https://hub.docker.com/r/opensealion/openaoe?label=docker)
![GitHub commit activity](https://img.shields.io/github/commit-activity/w/internlm/openaoe?label=commits)
![PyPI - Downloads](https://img.shields.io/pypi/dw/openaoe?label=pip%20downloads)





English | [简体中文](docs/README_zh-CN.md)

</div>


## Latest Progress 🎉
- \[February 2024\] Add mistral-7b model
- \[February 2024\] Add gemini-pro model
- \[January 2024\] Refactored `config-template.yaml` so that it controls both backend and frontend settings; see the [config-template guide](https://github.com/InternLM/OpenAOE/blob/main/docs/tech-report/config-template.md) for more details
- \[January 2024\] Add internlm2-chat-7b model
- \[January 2024\] Released version v0.0.1, officially open source!
______________________________________________________________________

# Introduction
## What is OpenAOE?
AOE, an acronym borrowed from DOTA2 for Area Of Effect, denotes an ability that affects a group of targets within a certain area.
Here, AOE in AI means that users can obtain parallel outputs from multiple LLMs with a single prompt, all at the same time.

![](docs/_static/gif/aoe-en.gif)


## What problem does OpenAOE want to solve?
Currently, there are many open-source chat frameworks built around ChatGPT-style models, but an LGC (LLM Group Chat) framework has yet to appear.

The emergence of OpenAOE fills this gap:
OpenAOE helps LLM researchers, evaluators, engineering developers, and even non-professionals quickly access well-known commercial and open-source LLMs, providing both a single-model serial response mode and a multi-model parallel response mode.



## What can you get from OpenAOE?
OpenAOE can:
1. Return one or more LLMs' answers **at the same time** for a single prompt.
2. Provide access to commercial LLM APIs, with default support for GPT-3.5, GPT-4, Google PaLM, MiniMax, Claude, Spark, etc., and also support user-defined access to other large-model APIs. (API keys need to be prepared in advance.)
3. Provide access to open-source LLM APIs. (We recommend using [LMDeploy](https://github.com/InternLM/lmdeploy) for one-click deployment.)
4. Provide backend APIs and a web UI to meet different requirements.



# Quick Run
> [!TIP]
> Requires Python >= 3.9

We provide three ways to run OpenAOE: `run by pip`, `run by docker`, and `run by source code`.

## Run by pip 
### **Install**
```shell
pip install -U openaoe 
```
### **Start**
```shell
openaoe -f /path/to/your/config-template.yaml
```

## Run by docker
### **Install**

There are two ways to get the OpenAOE docker image:
1. Pull the prebuilt OpenAOE docker image
```shell
docker pull opensealion/openaoe:latest
```

2. Or build the docker image yourself
```shell
git clone https://github.com/internlm/OpenAOE
cd OpenAOE
docker build . -f docker/Dockerfile -t opensealion/openaoe:latest
```

### **Start**
```shell
docker run -p 10099:10099 -v /path/to/your/config-template.yaml:/app/config.yaml --name OpenAOE opensealion/openaoe:latest
```
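
Once the container is running, the web UI should be reachable on the published port (10099, per the `-p 10099:10099` mapping above). A quick sanity check, assuming the service answers HTTP on that port:

```shell
# Print the HTTP status code returned by the service on the published port.
# (Assumes OpenAOE serves HTTP on 10099, as the port mapping above suggests.)
curl -s -o /dev/null -w "%{http_code}\n" http://localhost:10099
```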

## Run by source code
### **Install**
1. Clone this project
```shell
git clone https://github.com/internlm/OpenAOE
```
2. [_Optional_] Rebuild the frontend project if the frontend code has changed
```shell
cd OpenAOE/openaoe/frontend
npm install
npm run build
```


### **Start**
```shell
cd OpenAOE # this OpenAOE is the clone directory
pip install -r openaoe/backend/requirements.txt
python -m openaoe.main -f /path/to/your/config-template.yaml
```
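
If you want to keep the dependencies above isolated, you can create a fresh environment first with conda (listed as a build tool in the Tech Stack section below); the `python=3.10` pin is just an example that satisfies the Python >= 3.9 requirement:

```shell
# Optional: create an isolated environment before installing the requirements.
conda create -n openaoe python=3.10 -y
conda activate openaoe
```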


> [!TIP]
> `/path/to/your/config-template.yaml` is the configuration file OpenAOE loads at startup.
> It contains the configuration for the LLMs,
> including API URLs, AK/SKs, tokens, etc.
> A template configuration YAML file can be found at `openaoe/backend/config/config-template.yaml`.
> Note that this `config-template.yaml` DOES NOT contain any API access credentials; you need to add them yourself.
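
The authoritative schema is the template shipped at `openaoe/backend/config/config-template.yaml`; the snippet below is only a hypothetical sketch of the kind of per-model entries (API URL, key/secret, token) the tip describes, and its field names are assumptions rather than the real ones:

```yaml
# Hypothetical sketch only -- copy the real template from
# openaoe/backend/config/config-template.yaml and fill in your own credentials.
models:
  gpt-3.5-turbo:
    api_url: https://api.openai.com/v1/chat/completions  # commercial API endpoint
    api_key: <your-openai-key>                           # never commit real keys
  internlm2-chat-7b:
    api_url: http://localhost:23333/v1/chat/completions  # e.g. a local LMDeploy server
    api_key: none
```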


# Tech Report
> **You are always welcome to fork this project, contribute your work,**
> **and check the [future TODOs](docs/todo/TODO.md).**

If you want to add more LLM APIs or features to OpenAOE, the following information may be helpful.

## Tech Stack
The technology stack we use includes:

1. Backend framework based on Python + FastAPI;
2. Frontend framework based on TypeScript + Sealion-Client (a wrapper around React) + Sealion-UI.
3. Build tools:
   1. conda: quickly creates an isolated Python environment for installing the necessary packages
   2. npm: builds the frontend project

> [!TIP]
> The build tools can be installed quickly with `pip install -U sealion-cli`

## Organization of the Repo
- Frontend code lives in `openaoe/frontend`
- Backend code lives in `openaoe/backend`
- The project entry point is `openaoe/main.py`

## How to add a new model
### Frontend
- Add the new model's info (`name`, `avatar`, `provider`, etc.) in `openaoe/frontend/src/config/model-config.ts` (see the sketch after this list)
- Add the new model's basic API request payload configuration in `openaoe/frontend/src/config/api-config.ts`
- Adjust the new model's payload in `openaoe/frontend/src/services/fetch.ts`; you may need to change the payload structure and handle corner cases according to your model's API definition.
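
As a purely hypothetical sketch of the first step (the real entry shape in `model-config.ts` may differ; every field name below is an assumption based on the list above):

```typescript
// Hypothetical example only -- mirror the structure of the existing entries in
// openaoe/frontend/src/config/model-config.ts rather than this sketch.
export const myNewModel = {
  name: 'my-model-7b',            // model name shown in the UI and sent to the backend
  avatar: 'assets/my-model.svg',  // icon displayed in the chat list
  provider: 'my-provider',        // which API provider/route serves this model
};
```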

            

Raw data

            {
    "_id": null,
    "home_page": "https://github.com/internlm/openaoe",
    "name": "openaoe",
    "maintainer": "",
    "docs_url": null,
    "requires_python": "",
    "maintainer_email": "",
    "keywords": "openaoe,open-aoe",
    "author": "arkmon",
    "author_email": "",
    "download_url": "https://files.pythonhosted.org/packages/80/e4/ffb871615ce71bf0a3f712e83a0e751930f7472049739acbedbe0fbc3a35/openaoe-0.0.6.tar.gz",
    "platform": null,
    "description": "<div align=\"center\">\n  <img src=\"docs/_static/image/aoe-logo.svg\" width=\"250\"/>\n\n[![PyPI](https://img.shields.io/pypi/v/OpenAOE)](https://pypi.org/project/OpenAOE)\n[![Docker Image Version (tag latest semver)](https://img.shields.io/docker/v/opensealion/openaoe?label=docker)](https://hub.docker.com/r/opensealion/openaoe?label=docker)\n![GitHub commit activity](https://img.shields.io/github/commit-activity/w/internlm/openaoe?label=commits)\n![PyPI - Downloads](https://img.shields.io/pypi/dw/openaoe?label=pip%20downloads)\n\n\n\n\n\nEnglish | [\u7b80\u4f53\u4e2d\u6587](docs/README_zh-CN.md)\n\n</div>\n\n\n## Latest Progress \ud83c\udf89\n- \\[February 2024\\] Add mistral-7b model\n- \\[February 2024\\] Add gemini-pro model\n- \\[January 2024\\] refactor the config-template.yaml to control the backend and the frontend settings at the same time, [click](https://github.com/InternLM/OpenAOE/blob/main/docs/tech-report/config-template.md) to find more introduction about the `config-template.yaml`\n- \\[January 2024\\] Add internlm2-chat-7b model\n- \\[January 2024\\] Released version v0.0.1, officially open source\uff01\n______________________________________________________________________\n\n# Introduction\n## What is OpenAOE?\nAOE, an acronym from DOTA2 for Area Of Effect, denotes an ability that can affect a group of targets within a certain area.\nHere, AOE in AI implies that user can obtain parallel outputs from multiple LLMs with one single prompt at the same time.\n\n![](docs/_static/gif/aoe-en.gif)\n\n\n## What problem does OpenAOE want to solve?\nCurrently, there are many open-source frameworks based on the ChatGPT for chat, but the LGC(LLM Group Chat) framework is still not coming yet.\n\nThe emergence of OpenAOE fills this gap:\nOpenAOE can help LLM researchers, evaluators, engineering developers, and even non-professionals to quickly access the market's well-known commercial and open-source LLMs, providing both single model serial response mode and multi-models parallel response mode.\n\n\n\n## What can you get from OpenAOE?\nOpenAOE can:\n1. return one or more LLMs' answers **at the same time** by a single prompt.\n2. provide access to commercial LLM APIs, with default support for gpt3.5, gpt4, Google Palm, Minimax, Claude, Spark, etc., and also support user-defined access to other large model APIs. (API keys need to be prepared in advanced)\n3. provide access to open-source LLM APIs. ( We recommend to use [LMDeploy](https://github.com/InternLM/lmdeploy) to deploy with one click)\n4. provide backend APIs and a WEB-UI to meet the needs of different requirements.\n\n\n\n# Quick Run\n> [!TIP]\n> Require python >= 3.9\n\nWe provide three different ways to run OpenAOE: `run by pip`\uff0c `run by docker` and `run by source code` as well.\n\n## Run by pip \n### **Install**\n```shell\npip install -U openaoe \n```\n### **Start**\n```shell\nopenaoe -f /path/to/your/config-template.yaml\n```\n\n## Run by docker\n### **Install**\n\nThere are two ways to get the OpenAOE docker image by:\n1. pull the OpenAOE docker image\n```shell\ndocker pull opensealion/openaoe:latest\n```\n\n2. or build a docker image\n```shell\ngit clone https://github.com/internlm/OpenAOE\ncd OpenAOE\ndocker build . -f docker/Dockerfile -t opensealion/openaoe:latest\n```\n\n### **Start**\n```shell\ndocker run -p 10099:10099 -v /path/to/your/config-template.yaml:/app/config.yaml --name OpenAOE opensealion/openaoe:latest\n```\n\n## Run by source code\n### **Install**\n1. 
clone this project\n```shell\ngit clone https://github.com/internlm/OpenAOE\n```\n2. [_optional_] build the frontend project when the frontend codes are changed\n```shell\ncd OpenAOE/openaoe/frontend\nnpm install\nnpm run build\n```\n\n\n### **Start**\n```shell\ncd OpenAOE # this OpenAOE is the clone directory\npip install -r openaoe/backend/requirements.txt\npython -m openaoe.main -f /path/to/your/config-template.yaml\n```\n\n\n> [!TIP]\n> `/path/to/your/config-template.yaml` is a configuration file loaded by OpenAOE at startup, \n> which contains the relevant configuration information for the LLMs,\n> including: API URLs, AKSKs, Tokens, etc.\n> A template configuration yaml file can be found in `openaoe/backend/config/config-template.yaml`.\n> Note that, this `config-template.yaml` DOES NOT contain any API access data, you should add them by yourself.\n\n\n#  Tech Report\n> **You are always welcome to fork this project to contribute your work**\n> **and find the [TODOs in furture](docs/todo/TODO.md).**\n\nIf you want to add more LLMs' APIs or features based on OpenAOE, the following info might be helpful.\n\n## Tech Stack\nThe technology stack we use includes:\n\n1. Backend framework based on python + fastapi;\n2. Frontend framework based on typescript + Sealion-Client (encapsulated based on React) + Sealion-UI.\n3. Build tools:\n   1. conda: quickly create a virtual python env to install necessary packages\n   2. npm: build the frontend project\n\n> [!TIP]\n> The build tools can be installed quickly by `pip install -U sealion-cli`\n\n## Organization of the Repo\n- Frontend codes are in `openaoe/frontend`\n- Backend codes are in `openaoe/backend`\n- Project entry-point is `openaoe/main.py`\n\n## How to add a new model\n### Frontend\n- Add new model info like `name`, `avatar`, `provider`, etc in `openaoe/frontend/src/config/model-config.ts`\n- Add a new model basic API request payload configuration in `openaoe/frontend/src/config/api-config.ts`\n- Modify your new model's payload specifically in `openaoe/frontend/src/services/fetch.ts`, you may need to change the payload structure and handle corner cases according to your model's API definition.\n",
    "bugtrack_url": null,
    "license": "Apache License 2.0",
    "summary": "LLM Group Chat Framework: chat with multiple LLMs at the same time",
    "version": "0.0.6",
    "project_urls": {
        "Homepage": "https://github.com/internlm/openaoe"
    },
    "split_keywords": [
        "openaoe",
        "open-aoe"
    ],
    "urls": [
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "8335d50e9f430b3274f3434a6b5a57b3f260889bdcadd5a4e76718475ec23a9f",
                "md5": "801bfc233f9b7865733ec75c26aa29d9",
                "sha256": "4bb2e7fcdfac11c6a8ebd8d69ec533d58ae70f3e79844beb4337eeb51be8fe5e"
            },
            "downloads": -1,
            "filename": "openaoe-0.0.6-py3-none-any.whl",
            "has_sig": false,
            "md5_digest": "801bfc233f9b7865733ec75c26aa29d9",
            "packagetype": "bdist_wheel",
            "python_version": "py3",
            "requires_python": null,
            "size": 4692096,
            "upload_time": "2024-03-05T13:39:45",
            "upload_time_iso_8601": "2024-03-05T13:39:45.873928Z",
            "url": "https://files.pythonhosted.org/packages/83/35/d50e9f430b3274f3434a6b5a57b3f260889bdcadd5a4e76718475ec23a9f/openaoe-0.0.6-py3-none-any.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "80e4ffb871615ce71bf0a3f712e83a0e751930f7472049739acbedbe0fbc3a35",
                "md5": "94f874487538ecb475d3d58f13d795ab",
                "sha256": "31301d8cbefe6c2959e73f9a81ef50bf3662305fe650737367ab9e375990a064"
            },
            "downloads": -1,
            "filename": "openaoe-0.0.6.tar.gz",
            "has_sig": false,
            "md5_digest": "94f874487538ecb475d3d58f13d795ab",
            "packagetype": "sdist",
            "python_version": "source",
            "requires_python": null,
            "size": 4658440,
            "upload_time": "2024-03-05T13:39:50",
            "upload_time_iso_8601": "2024-03-05T13:39:50.598855Z",
            "url": "https://files.pythonhosted.org/packages/80/e4/ffb871615ce71bf0a3f712e83a0e751930f7472049739acbedbe0fbc3a35/openaoe-0.0.6.tar.gz",
            "yanked": false,
            "yanked_reason": null
        }
    ],
    "upload_time": "2024-03-05 13:39:50",
    "github": true,
    "gitlab": false,
    "bitbucket": false,
    "codeberg": false,
    "github_user": "internlm",
    "github_project": "openaoe",
    "travis_ci": false,
    "coveralls": false,
    "github_actions": true,
    "lcname": "openaoe"
}
        