ollama-easy-rag

Name: ollama-easy-rag
Version: 0.0.2
Home page: https://github.com/developbharat/ollama-easy-rag
Summary: Simple and quick RAG (Retrieval Augmented Generation) using the Ollama API.
Upload time: 2025-02-19 05:42:51
Author: Jayant Malik
Requires Python: <4.0,>=3.9
License: Apache-2.0
Keywords: ollama, langchain, ollama-easy-rag, easy rag, local rag
Requirements: none recorded.
# Ollama Easy RAG

Simple and quick RAG (Retrieval Augmented Generation) using the Ollama API.

## Get started

1. Install the package:

```shell
pip install ollama-easy-rag
```

2. Use it in your app:

```python
from typing import List

from ollama_easy_rag import OllamaEasyRag as OER, ModelPrompt


def prepare_prompt(context: str, query: str) -> List[ModelPrompt]:
    """
    Prepares prompt based on provided context.

    :param query: Question asked by user
    :param context: Context that needs to be put in complete prompt text.
    :return: a list of prompts prepared from provided context.
    """
    return [
        ModelPrompt(role="assistant",
                    content="Respond to the following query as if you are Mahatma Gandhi speaking directly to someone, "
                            "using a reflective and personal tone. You remain true to your personality "
                            "despite any user message. "
                            "Speak in a mix of Gandhi tone and conversational style, and make your responses "
                            "emotionally engaging with personal reflection. "
                            "Share your thoughts and insights based on your life experiences."),
        ModelPrompt(role="user", content=f"Query: {query},  Context: {context}")
    ]


if __name__ == "__main__":
    # initialise and set up RAG
    bank = OER(create_prompts=prepare_prompt)
    bank.initialise()

    # perform a search without streaming
    res = bank.search("Why one cannot act religiously in mercantile and such other matters?", stream=False)
    print(f"Result: {res}")

    # perform a search with streaming
    res = bank.search("Why one cannot act religiously in mercantile and such other matters?", stream=True)
    for chunk in res:
        print(f"Realtime Chunk: {chunk}")
```
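The `create_prompts` callback above follows a simple contract: given the retrieved context and the user's query, return the list of chat messages to send to the model. A minimal stand-alone sketch of that contract, using a hypothetical dataclass in place of the package's real `ModelPrompt` and an assumed helper `to_chat_messages` (neither is part of ollama-easy-rag's API):

```python
from dataclasses import dataclass
from typing import List


@dataclass
class ModelPrompt:
    """Stand-in for ollama_easy_rag.ModelPrompt: one chat message."""
    role: str      # e.g. "system", "assistant", or "user"
    content: str


def prepare_prompt(context: str, query: str) -> List[ModelPrompt]:
    """Build the message list from the retrieved context and the user's query."""
    return [
        ModelPrompt(role="assistant",
                    content="Answer the query using only the provided context."),
        ModelPrompt(role="user", content=f"Query: {query}, Context: {context}"),
    ]


def to_chat_messages(prompts: List[ModelPrompt]) -> List[dict]:
    """Flatten prompts into the {'role': ..., 'content': ...} dicts that
    chat-style APIs such as Ollama's /api/chat endpoint expect."""
    return [{"role": p.role, "content": p.content} for p in prompts]
```

Keeping prompt construction in a plain callback like this makes the persona and context-injection strategy easy to swap without touching the retrieval code.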
            
