# easy_rag_llm
## CAUTION
- The easy-rag-llm==1.0.* releases were test versions and are generally unusable; install 1.1.0 or later.
## Introduction
- easy_rag_llm is a lightweight RAG-based service that supports both OpenAI and DeepSeek models.
It is designed to seamlessly integrate RAG-based LLM functionalities into your service.
- As of 2025-01-15 (v1.1.0), the supported resource format for training is PDF.
## Usage
#### Install (https://pypi.org/project/easy-rag-llm/)
```bash
pip install easy-rag-llm
```
#### How to integrate to your service?
```python
from easy_rag import RagService

# Example using the DeepSeek chat model (embeddings always use OpenAI)
rs = RagService(
    embedding_model="text-embedding-3-small",  # fixed to an OpenAI embedding model
    response_model="deepseek-chat",            # or an OpenAI chat model such as "gpt-3.5-turbo"
    open_api_key="your_openai_api_key_here",
    deepseek_api_key="your_deepseek_api_key_here",
    deepseek_base_url="https://api.deepseek.com",
)

# Example using an OpenAI chat model
rs2 = RagService(
    embedding_model="text-embedding-3-small",
    response_model="gpt-3.5-turbo",
    open_api_key="your_openai_api_key_here",
)

# Build (or load) the index from all files under ./rscFiles
resource = rs.rsc("./rscFiles", force_update=False, max_workers=5)  # default max_workers is 10

query = "Explain what is taught in the third week's lecture."
response, top_evidence = rs.generate_response(resource, query, evidence_num=5)  # default evidence_num is 3
print(response)
```
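Hardcoding API keys in source code is risky. Since python-dotenv is among the package's pinned dependencies, you can keep keys in a `.env` file or the process environment instead. The variable names below are illustrative, not required by the library:

```python
import os

# Illustrative environment variable names -- easy_rag_llm does not mandate
# these; load the values however you like and pass them to RagService.
openai_key = os.environ.get("OPENAI_API_KEY", "")
deepseek_key = os.environ.get("DEEPSEEK_API_KEY", "")

# rs = RagService(
#     embedding_model="text-embedding-3-small",
#     response_model="deepseek-chat",
#     open_api_key=openai_key,
#     deepseek_api_key=deepseek_key,
#     deepseek_base_url="https://api.deepseek.com",
# )
```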
### Note
- Ensure that your PDFs have clear titles. Extracted titles from the PDF metadata are used during training and for generating evidence-based responses.
- Running `rs.rsc("./folder")` generates `faiss_index.bin` and `metadata.json` files. Subsequently, the system uses the existing .bin and .json files to generate responses. If you want to reflect changes by adding or removing files in the folder, you can enable forced updates by setting `force_update=True`.
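Because of this caching, edits to the folder are invisible until you force a rebuild. One way to decide when to pass `force_update=True` is to compare modification times yourself; `index_is_stale` below is a hypothetical helper sketched for illustration, not part of the easy_rag_llm API:

```python
from pathlib import Path

def index_is_stale(folder: str, index_file: str = "faiss_index.bin") -> bool:
    """Return True if any PDF in `folder` is newer than the saved index.

    Illustrative helper only: the library itself rebuilds the index
    solely when force_update=True is passed.
    """
    index_path = Path(index_file)
    if not index_path.exists():
        return True  # no index yet: a build is needed anyway
    index_mtime = index_path.stat().st_mtime
    return any(p.stat().st_mtime > index_mtime for p in Path(folder).glob("*.pdf"))

# resource = rs.rsc("./rscFiles", force_update=index_is_stale("./rscFiles"))
```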
### Release versions
- 1.0.12: Supported, but the embedding and chat models are fixed to OpenAI's `text-embedding-3-small` and `deepseek-chat`, respectively. The thread pool is fixed at 10 workers, which may cause errors in some environments.
- 1.1.0: LTS version.
### TODO
- Folder-based index organization: given `./rscFiles` as input, create an `rscFilesIndex` directory and store the indexes under it; if embeddings already exist under `index/`, reuse them.
- Replace the thread pool with asyncio. (v1.2.* ~)
- Support HNSW in addition to L2-based vector search, with a hands-on performance comparison. (v1.3.0~)
- Diversify input formats: support formats other than PDF. (v1.4.* ~)
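For context on the HNSW item: the current search is a flat (exhaustive) L2 scan, which computes the query's distance to every stored vector. A minimal, dependency-free sketch of that behavior, purely illustrative (FAISS does the real work inside the library):

```python
import math

def l2_search(query, vectors, k=3):
    """Exhaustive (flat) L2 nearest-neighbor search over a list of vectors.

    Illustrative stand-in for what a flat L2 index does; HNSW replaces
    this O(n) scan with an approximate graph-based search.
    """
    scored = [
        (math.dist(query, v), i)  # Euclidean distance to each stored vector
        for i, v in enumerate(vectors)
    ]
    scored.sort()
    return [i for _, i in scored[:k]]  # indices of the k nearest vectors
```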
### What can you do with this?
https://github.com/Aiden-Kwak/ClimateJudgeLLM
### Author Information
- 곽병혁 (https://github.com/Aiden-Kwak)