fschat-FlagEmbedding-worker

Name: fschat-FlagEmbedding-worker
Version: 0.1.1
Summary: FlagEmbedding model worker for fastchat.
Author / maintainer: Jiang XiaoKai
License: MIT
Keywords: fastchat, FlagEmbedding, model_worker
Upload time: 2024-06-09 12:19:41
Requires Python: not specified
Home page / docs URL: none
Requirements: none recorded

# fschat-FlagEmbedding-worker

FlagEmbedding model worker for fastchat.

## Install

```shell
pip install fschat-FlagEmbedding-worker
```

## Usage

### Start the embedding model worker

```shell
python -m fschat_flagembedding_worker.serve.flagmodel_worker --controller-address http://localhost:21001 --model-path=/models/BAAI/bge-large-zh-v1.5 --port=22002 --worker-address=http://localhost:22002 --debug
```

### Start the reranker model worker

```shell
python -m fschat_flagembedding_worker.serve.flagreranker_worker --controller-address http://localhost:21001 --model-path=/models/BAAI/bge-reranker-large --port=22003 --worker-address=http://localhost:22003 --debug
```

### Use the embedding model

```python
import os
from langchain_openai.embeddings import OpenAIEmbeddings

API_SECRET_KEY = "QufTZCA5Y1zIaC3GxNBUn2Q0vW1NITKm"
BASE_URL = "http://localhost:8000/v1"

EMBEDDING_MODEL_NAME = "bge-large-zh-v1.5"
RERANKER_MODEL_NAME = "bge-reranker-large"

os.environ["OPENAI_API_KEY"] = API_SECRET_KEY
os.environ["OPENAI_API_BASE"] = BASE_URL

# tiktoken-based tokenization and context-length checks assume OpenAI-hosted
# models, so both are disabled for the locally served BGE model
embd = OpenAIEmbeddings(
    model=EMBEDDING_MODEL_NAME,
    check_embedding_ctx_length=False,
    tiktoken_enabled=False,
)
r1 = embd.embed_query("hello world")
print(r1)
```

*Output:*

```text
[-0.02532958984375, -0.0016546249389648438, ..., 0.018463134765625, 0.0161895751953125]
```
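
The returned vectors are ordinary embeddings, so they can be compared directly. As a minimal sketch (not part of the package), reusing the `embd` instance from the example above and assuming the worker and the OpenAI-compatible gateway are running, cosine similarity between two texts looks like this:

```python
import numpy as np

# reuse the `embd` instance configured above; embed two texts in one call
v1, v2 = embd.embed_documents(["hello world", "hello there"])
v1, v2 = np.asarray(v1), np.asarray(v2)

# cosine similarity between the two embedding vectors
similarity = float(v1 @ v2 / (np.linalg.norm(v1) * np.linalg.norm(v2)))
print(f"cosine similarity: {similarity:.4f}")
```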


### Use the reranker model

- The OpenAI API does not provide a reranker endpoint, so the embedding endpoint is reused here to obtain reranker scores.
- The input therefore follows a convention: it must be an interleaved list of the form [query, passage1, query, passage2, ...].
- Note in particular that when starting the openai_api server, an extra environment variable must be set: FASTCHAT_WORKER_API_EMBEDDING_BATCH_SIZE.

*Example of starting the OpenAI API server:*

```shell
#!/bin/bash
export FASTCHAT_WORKER_API_EMBEDDING_BATCH_SIZE=128
/opt/conda/bin/python -m fastchat.serve.openai_api_server --host 0.0.0.0 --api-keys MRpta0LlV0SrxpYBgbecyyF47Pvt263O
```

*Example of using the reranker model:*

```python
import os
from langchain_openai.embeddings import OpenAIEmbeddings

API_SECRET_KEY = "QufTZCA5Y1zIaC3GxNBUn2Q0vW1NITKm"
BASE_URL = "http://localhost:8000/v1"

EMBEDDING_MODEL_NAME = "bge-large-zh-v1.5"
RERANKER_MODEL_NAME = "bge-reranker-large"

os.environ["OPENAI_API_KEY"] = API_SECRET_KEY
os.environ["OPENAI_API_BASE"] = BASE_URL

# same client setup as above, but pointed at the reranker model
reranker = OpenAIEmbeddings(
    model=RERANKER_MODEL_NAME,
    check_embedding_ctx_length=False,
    tiktoken_enabled=False,
)


query = "hello"
passages = ["hi", "world", "yes", "how are you?"]

# interleave as [query, passage1, query, passage2, ...], as the worker expects
messages = []
for passage in passages:
    messages += [query, passage]

r1 = reranker.embed_query(messages)
print(r1)
```

*Output:*

```text
[5.96577262878418, -3.2970359325408936, 0.9453534483909607, 0.5078737735748291]
```
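
The scores appear to come back one per (query, passage) pair, in the order the pairs were sent (an assumption based on the output above, where "hi" scores highest against "hello"). A small sketch, reusing `reranker`, `messages`, and `passages` from the example:

```python
# assumption: the i-th score corresponds to the i-th (query, passage) pair,
# so zip the scores back onto the passages and sort descending
scores = reranker.embed_query(messages)
ranked = sorted(zip(passages, scores), key=lambda item: item[1], reverse=True)
for passage, score in ranked:
    print(f"{score:8.3f}  {passage}")
```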


## Releases

### v0.1.1

- First release.



            
