# Cardinal
[![GitHub Code License](https://img.shields.io/github/license/the-seeds/cardinal)](LICENSE)
[![PyPI](https://img.shields.io/pypi/v/pycardinal)](https://pypi.org/project/pycardinal/)
## Usage
Create a `.env` file in the root directory:
```
.
├── src
└── .env
```
```
# imitater or openai
OPENAI_BASE_URL=http://192.168.0.1:8010/v1
OPENAI_API_KEY=0

# models
EMBED_MODEL=text-embedding-ada-002
CHAT_MODEL=gpt-3.5-turbo-1106
TOKENIZER_PATH=Qwen/Qwen-14B-Chat

# text splitter
CHUNK_SIZE=200
CHUNK_OVERLAP=30
NUM_CPU_CORE=16

# storages
STORAGE=redis
REDIS_URI=redis://192.168.0.1:6379
ELASTICSEARCH_URI=http://192.168.0.1:9001

# vectorstore
VECTORSTORE=chroma
CHROMA_PATH=./chroma
MILVUS_URI=http://192.168.0.1:19530
MILVUS_TOKEN=0
ADMIN_USER_ID=123456789

# KBQA
DEFAULT_SYSTEM_PROMPT=You are a helpful assistant.
EMBED_INSTRUCTION=为这个句子生成表示以用于检索相关文章:
PLAIN_TEMPLATE=你是ChatGPT,由OpenAI开发的大语言模型,针对问题作出详细和有帮助的解答。\n\n问题:{question}
KBQA_TEMPLATE=你是ChatGPT,由OpenAI开发的大语言模型,根据已知信息,针对问题作出详细和有帮助的解答。\n\n已知信息:{context}\n\n问题:{question}
KBQA_THRESHOLD=1.0
KBQA_TEMPERATURE=0.95

# WordGraph
KEYWORD_TEMPLATE=提取文章里的实体概念,数量不超过三个,用分号“;”分割。文章内容:{answer}
KEYWORD_TEMPERATURE=0

# service
SERVICE_PORT=8020

# tests
SERVER_URL=http://192.168.0.1:8020
```
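These settings are plain `KEY=VALUE` pairs. The package depends on `python-dotenv`, which loads such a file at startup; as a stdlib-only illustration of the format (not Cardinal's actual loader), a minimal parser might look like:

```python
# Minimal sketch: parse simple KEY=VALUE lines from a .env-style file.
# Cardinal depends on python-dotenv, which handles this robustly; this
# stdlib-only version only illustrates the file format above.
def parse_env(text: str) -> dict[str, str]:
    env = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blank lines and comments
        key, _, value = line.partition("=")
        env[key.strip()] = value.strip()
    return env

sample = "# storages\nSTORAGE=redis\nCHUNK_SIZE=200\n"
config = parse_env(sample)
# config["STORAGE"] == "redis", config["CHUNK_SIZE"] == "200"
```

Note that every value is read as a string, so numeric settings such as `CHUNK_SIZE` need an explicit `int(...)` conversion by the consumer.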
## Build Database
```bash
python src/launcher.py --action build
```
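Building the database splits source documents into chunks governed by `CHUNK_SIZE` and `CHUNK_OVERLAP`. A hypothetical sketch of the sliding-window splitting these two settings usually control (not necessarily Cardinal's exact implementation):

```python
# Hypothetical sliding-window splitter: each chunk holds up to
# chunk_size tokens and repeats the last chunk_overlap tokens of the
# previous chunk, so context is not cut mid-thought at boundaries.
def split_tokens(tokens: list[str], chunk_size: int = 200, chunk_overlap: int = 30) -> list[list[str]]:
    step = chunk_size - chunk_overlap  # window advances by size minus overlap
    return [
        tokens[i : i + chunk_size]
        for i in range(0, max(len(tokens) - chunk_overlap, 1), step)
    ]

# With chunk_size=4 and chunk_overlap=1, each chunk shares one token
# with its neighbor: ['a'..'j'] -> abcd, defg, ghij.
chunks = split_tokens(list("abcdefghij"), chunk_size=4, chunk_overlap=1)
```

With the defaults above (200-token chunks, 30-token overlap), consecutive chunks share 30 tokens, trading some index size for retrieval robustness.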
## Launch Server
```bash
python src/launcher.py --action launch
```
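Once launched, the server answers questions using the prompt templates from the config. The `{context}`/`{question}` placeholders match Python's `str.format` syntax; a sketch under that assumption, using a simplified English stand-in for `KBQA_TEMPLATE`:

```python
# Sketch: filling the KBQA prompt template. That str.format is used
# internally is an assumption, but it matches the placeholder syntax
# in the config; the template here is a simplified English stand-in.
template = "Known information: {context}\n\nQuestion: {question}"
prompt = template.format(
    context="(retrieved chunks)",
    question="(user question)",
)
# prompt == "Known information: (retrieved chunks)\n\nQuestion: (user question)"
```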
## View Collected Messages
```bash
python src/launcher.py --action view
```
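The `KBQA_THRESHOLD` setting above suggests that retrieved chunks are filtered by vector-store distance before they reach the prompt. A hypothetical sketch of such gating (the comparison direction, smaller distance meaning more similar, is an assumption):

```python
# Hypothetical retrieval gate: drop (chunk, distance) hits whose
# distance exceeds the threshold, so only sufficiently similar
# context is injected into the KBQA prompt.
def filter_hits(hits: list[tuple[str, float]], threshold: float = 1.0) -> list[str]:
    return [chunk for chunk, distance in hits if distance <= threshold]

kept = filter_hits([("close match", 0.2), ("far match", 1.5)], threshold=1.0)
# kept == ["close match"]
```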
## Raw data

```json
{
    "_id": null,
    "home_page": "https://github.com/the-seeds/cardinal",
    "name": "pycardinal",
    "maintainer": "",
    "docs_url": null,
    "requires_python": ">=3.9.0",
    "maintainer_email": "",
    "keywords": "LLM",
    "author": "the-seeds",
    "author_email": "",
    "download_url": "https://files.pythonhosted.org/packages/7f/fb/8540ca83a858ffbcf7cd2dba8e29f8442cc1b1d8f6779468242239d2a6b4/pycardinal-0.2.0.tar.gz",
    "platform": null,
    "bugtrack_url": null,
    "license": "Apache 2.0 License",
    "summary": "",
    "version": "0.2.0",
    "project_urls": {
        "Homepage": "https://github.com/the-seeds/cardinal"
    },
    "split_keywords": [
        "llm"
    ],
    "urls": [
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "91698516510946bb6d3cecc918e5381f4f36846f2ebdfd1d55a8c24ed432a43e",
                "md5": "413a46221cf62f76e3573e92133699c9",
                "sha256": "c6507463bf6220f1605084a557a2b3182a424c0556f66d44a84c17387d6b55e4"
            },
            "downloads": -1,
            "filename": "pycardinal-0.2.0-py3-none-any.whl",
            "has_sig": false,
            "md5_digest": "413a46221cf62f76e3573e92133699c9",
            "packagetype": "bdist_wheel",
            "python_version": "py3",
            "requires_python": ">=3.9.0",
            "size": 31831,
            "upload_time": "2024-02-26T22:49:15",
            "upload_time_iso_8601": "2024-02-26T22:49:15.673864Z",
            "url": "https://files.pythonhosted.org/packages/91/69/8516510946bb6d3cecc918e5381f4f36846f2ebdfd1d55a8c24ed432a43e/pycardinal-0.2.0-py3-none-any.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": "",
            "digests": {
                "blake2b_256": "7ffb8540ca83a858ffbcf7cd2dba8e29f8442cc1b1d8f6779468242239d2a6b4",
                "md5": "01c34beca89453ccc5cc63b6cdee1569",
                "sha256": "17f02ac611b0419ebbc61575ba9f66ff6e25ddf2d2feef6905ab78ca8240c3aa"
            },
            "downloads": -1,
            "filename": "pycardinal-0.2.0.tar.gz",
            "has_sig": false,
            "md5_digest": "01c34beca89453ccc5cc63b6cdee1569",
            "packagetype": "sdist",
            "python_version": "source",
            "requires_python": ">=3.9.0",
            "size": 22840,
            "upload_time": "2024-02-26T22:49:17",
            "upload_time_iso_8601": "2024-02-26T22:49:17.618059Z",
            "url": "https://files.pythonhosted.org/packages/7f/fb/8540ca83a858ffbcf7cd2dba8e29f8442cc1b1d8f6779468242239d2a6b4/pycardinal-0.2.0.tar.gz",
            "yanked": false,
            "yanked_reason": null
        }
    ],
    "upload_time": "2024-02-26 22:49:17",
    "github": true,
    "gitlab": false,
    "bitbucket": false,
    "codeberg": false,
    "github_user": "the-seeds",
    "github_project": "cardinal",
    "travis_ci": false,
    "coveralls": false,
    "github_actions": true,
    "requirements": [
        {
            "name": "openai",
            "specs": [
                [
                    ">=",
                    "1.5.0"
                ]
            ]
        },
        {
            "name": "pydantic",
            "specs": [
                [
                    ">=",
                    "2.0.0"
                ]
            ]
        },
        {
            "name": "python-dotenv",
            "specs": []
        }
    ],
    "lcname": "pycardinal"
}
```