lugacore

Name: lugacore
Version: 0.1.0a1
Summary: A Naive RAG SDK using Gemini for embeddings, Qdrant for retrieval, and document loaders for building RAG pipelines from scratch.
Upload time: 2025-10-30 15:36:40
Home page: None
Maintainer: None
Docs URL: None
Author: None
Requires Python: >=3.12
License: None
Keywords: rag, retrieval-augmented-generation, gemini, qdrant, sdk, ai, nlp
Project URLs: Homepage: https://github.com/Acram123-arch/lugacore | Issues: https://github.com/Acram123-arch/lugacore/issues
Requirements: No requirements were recorded.

            <h1 align="center" >LugaCore</h1>

[![License:MIT](https://img.shields.io/badge/license-MIT-green.svg?style=flat-default)](LICENSE) 
![Release Status](https://img.shields.io/badge/Release_Status-3--alpha-orange?style=flat-default)
![LLM](https://img.shields.io/badge/LLM-Gemini-8E75B2?style=flat-defaultc&logo=google&logoColor=white)
![VectorDB](https://img.shields.io/badge/VectorDB-Qdrant-4515B3?style=flat-default&logo=qdrant&logoColor=white)

A lightweight, modular Retrieval-Augmented Generation (RAG) system built from fundamental Python libraries for efficient, verifiable Q&A.

## **📘 Overview**
This project is a RAG framework built without reliance on high-level orchestration libraries. It demonstrates a ground-up implementation of the key RAG components.

## **✨  Features**
**Custom Document Processing**:
* **Self-implemented Text Splitter**: Recursive, semantic text splitting with token-aware logic for optimal chunk boundaries.
* **Custom Document Loader**: Direct `.docx` parsing via `python-docx`, preserving paragraph structure (see the combined sketch below).
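As a rough illustration of these two pieces, here is a minimal sketch of paragraph-level `.docx` loading plus recursive, size-aware splitting. It is not LugaCore's actual implementation; the function names, separators, and chunk size are assumptions.

````python
# Illustrative sketch only -- not LugaCore's internal loader/splitter.
from docx import Document  # python-docx

def load_docx_paragraphs(path: str) -> str:
    """Read a .docx file and keep non-empty paragraphs, joined by blank lines."""
    doc = Document(path)
    return "\n\n".join(p.text for p in doc.paragraphs if p.text.strip())

def recursive_split(text: str, max_chars: int = 500,
                    separators: tuple[str, ...] = ("\n\n", "\n", ". ", " ")) -> list[str]:
    """Split on the coarsest separator first, recursing on pieces that are still too long."""
    if len(text) <= max_chars:
        return [text]
    if not separators:
        # No separator left: fall back to a hard character cut.
        return [text[i:i + max_chars] for i in range(0, len(text), max_chars)]
    sep, rest = separators[0], separators[1:]
    chunks: list[str] = []
    for piece in text.split(sep):
        chunks.extend([piece] if len(piece) <= max_chars else recursive_split(piece, max_chars, rest))
    return chunks

chunks = recursive_split(load_docx_paragraphs("path/to/your/file.docx"))
````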

**Intelligent Retrieval**:
* **Hybrid Search Engine**: Combines dense semantic and sparse keyword retrieval within Qdrant for flexible query resolution (illustrated in the sketch after this list).
* **Qdrant Integration**: Native support for:
  * Vector indexing and collection management
  * Batch upserts and attribute-based payload filtering
  * Hybrid retrieval experimentation (dense + sparse)
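
As an illustration of the hybrid pattern (not LugaCore's internals), the sketch below issues a fused dense + sparse query with the `qdrant-client` library. The collection name, named vectors, and query values are placeholders, and it assumes qdrant-client >= 1.10.

````python
# Sketch of a Qdrant hybrid (dense + sparse) query with reciprocal-rank fusion.
# Assumes a collection with named "dense" and "sparse" vectors; values are placeholders.
from qdrant_client import QdrantClient, models

client = QdrantClient(url="YOUR_QDRANT_URL", api_key="YOUR_QDRANT_API_KEY")

dense_query = [0.1, 0.2, 0.3]  # embedding of the user query (placeholder values)
sparse_query = models.SparseVector(indices=[7, 42], values=[0.8, 0.5])  # keyword weights

hits = client.query_points(
    collection_name="docs",
    prefetch=[
        models.Prefetch(query=dense_query, using="dense", limit=20),
        models.Prefetch(query=sparse_query, using="sparse", limit=20),
    ],
    query=models.FusionQuery(fusion=models.Fusion.RRF),  # fuse both candidate lists
    limit=5,
)
for point in hits.points:
    print(point.id, point.score, point.payload)
````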

**Gemini LLM Integration** 
* **LLM-Driven Generation**: Uses **Gemini** exclusively for answer synthesis, summarization, and context-aware responses.
* **Context injection**: Dynamically augments user queries with retrieved context chunks.
* **Conversation Flow**: Uses the [Gemini chat SDK](https://ai.google.dev/gemini-api/docs/text-generation) for multi-turn conversations, keeping track of user queries and RAG responses to improve relevance (see the sketch below).
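
A minimal sketch of that pattern with the `google-genai` chat API is shown below; the prompt template, model name, and retrieved chunks are placeholders rather than LugaCore's actual prompt.

````python
# Sketch of context injection over the Gemini chat SDK (google-genai).
# Not LugaCore's internals: prompt wording and retrieved chunks are placeholders.
from google import genai

client = genai.Client(api_key="YOUR_GEMINI_API_KEY")
chat = client.chats.create(model="gemini-2.0-flash")  # chat object keeps multi-turn history

retrieved_chunks = ["Physics is the study of matter and energy.", "..."]
question = "What is physics?"

# Augment the user query with the retrieved context before sending it to Gemini.
prompt = (
    "Answer using only the context below.\n\n"
    "Context:\n" + "\n---\n".join(retrieved_chunks) +
    f"\n\nQuestion: {question}"
)
response = chat.send_message(prompt)
print(response.text)
````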

**Modular & Transparent**
* 100% open and debuggable - designed for **learning**, **research**, and **low-level RAG experimentation**.


## **🚀  Installation & Quick Start**
Follow these steps to install **LugaCore** from PyPI and start using your Gemini-powered RAG pipeline.
### 1. Create project directory (if not already created)
```bash
# setup
mkdir my_project
cd my_project
```
### 2. Create and activate a virtual environment

````bash
# create a virtual environment
python -m venv virt

# Activate it (Windows, Git Bash)
source virt/Scripts/activate

# Activate it (macOS/Linux)
# source virt/bin/activate
````
### 3. Install LugaCore from PyPI


````bash
 pip install lugacore
````
### 4. Verify Installation

````bash
pip list | grep lugacore
# on Windows cmd/PowerShell, use: pip list | findstr lugacore
````
### 5. Quick Start Example

````python
from lugacore import LugaCore

# Initialize the RAG pipeline
rag = LugaCore(
  qdrant_api_key="YOUR_QDRANT_API_KEY",
  gemini_api_key="YOUR_GEMINI_API_KEY",
  qdrant_url="YOUR_QDRANT_URL"
)

# Ingest documents
rag.load_document(filepath_or_buffer="path/to/your/file.docx")

# Query your knowledge base
response = rag.ask("What is physics?")
print(response)
````
## **🧩  Planned Features**

* Re-ranking
* Knowledge graph integration
* Caching layer
* Multimodal support (images, audio, video)
* Agentic pipelines
* Async & sync support


            
