needle-python

Name: needle-python
Version: 0.4.0
Summary: Needle client library for Python
Author: Onur Eken
Requires Python: >=3.8, <4.0
License: MIT
Keywords: needle, api, retrieval-augmented generation, rag, information-retrieval, artificial intelligence, ai
Upload time: 2024-11-04 11:46:29
Home page: None
Maintainer: None
Docs URL: None
Requirements: No requirements were recorded.
Travis CI: No Travis.
Coveralls test coverage: No coveralls.

# Needle Python Library

[![PyPI - Version](https://img.shields.io/pypi/v/needle-python.svg)](https://pypi.org/project/needle-python)
[![PyPI - Python Version](https://img.shields.io/pypi/pyversions/needle-python.svg)](https://pypi.org/project/needle-python)

This Python library provides convenient access to the Needle API. It offers various methods and data types that, we believe, will help you explore the Needle API quickly. Some functionality may become available in the REST API earlier than in this Python library. In any case, we recommend taking a look at the complete [documentation](https://docs.needle-ai.com). Thank you for flying with us. 🚀

## Installation

This library requires Python >=3.8 and `pip`. You don't need the sources unless you want to modify the library. Install it with:

```
pip install needle-python
```

## Usage ⚡️

To get started, generate an API key for your account in the Developer settings menu at [Needle](https://needle-ai.com). Note that your key will be valid until you revoke it. Set the following environment variable before you run your code:

```
export NEEDLE_API_KEY=<your-api-key>
```

`NeedleClient` reads the API key from the environment by default. If you would like to override this behaviour, you can pass the key in as a parameter.
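
For instance, a minimal sketch of both options (the keyword argument name `api_key` is an assumption here; check the `NeedleClient` signature in your installed version):

```python
import os

from needle.v1 import NeedleClient

# Default: the client reads the key from the NEEDLE_API_KEY environment variable.
ndl = NeedleClient()

# Override: pass the key in explicitly (parameter name `api_key` assumed).
ndl = NeedleClient(api_key=os.environ["NEEDLE_API_KEY"])
```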

### Retrieve context from Needle

```python
import time

from needle.v1 import NeedleClient
from needle.v1.models import FileToAdd

ndl = NeedleClient()
collection = ndl.collections.create(name="Tech Trends")

# add a file to the collection
files = ndl.collections.files.add(
    collection_id=collection.id,
    files=[
        FileToAdd(
            name="tech-radar-30.pdf",
            url="https://www.thoughtworks.com/content/dam/thoughtworks/documents/radar/2024/04/tr_technology_radar_vol_30_en.pdf",
        )
    ],
)

# wait until indexing is complete
files = ndl.collections.files.list(collection.id)
while not all(f.status == "indexed" for f in files):
    time.sleep(5)
    files = ndl.collections.files.list(collection.id)

# retrieve relevant context
prompt = "What techniques moved into adopt in this volume of technology radar?"
results = ndl.collections.search(collection.id, text=prompt)
```

Needle instantly extracts key points from your files.
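
Each search result carries the matched text in its `content` field (the same field the RAG example below feeds to the LLM), so a quick way to inspect what was retrieved is:

```python
# print the snippets returned by ndl.collections.search(...)
for r in results:
    print(r.content)
```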

### Complete your RAG pipeline

Naturally, to compose a human-friendly answer, use an LLM provider of your choice. For demo purposes, we use OpenAI in this example:

```python
from openai import OpenAI

system_messages = [{"role": "system", "content": r.content} for r in results] # results from Needle
user_message = {
    "role": "system",
    "content": f"""
        Only answer the question based on the provided results data. 
        If there is no data in the provided data for the question, do not try to generate an answer.
        This is the question: {prompt}
""",
}

openai_client = OpenAI()
answer = openai_client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[
        *system_messages,
        user_message,
    ],
)

print(answer.choices[0].message.content)
# -> Retrieval-Augmented Generation (RAG) is the technique that moved into "Adopt" in this volume of the Technology Radar.
```

This is one basic example of a RAG pipeline you can quickly implement using Needle and OpenAI. Feel free to engineer more precise prompts and explore other prompting techniques such as chain-of-thought (CoT), graph-of-thoughts (GoT), etc.

The Needle API gives you hassle-free contextualization; however, it does not limit you to a certain RAG technique. Let us know what you build in our [Discord channel](https://discord.gg/JzJcHgTyZx) :)

## Exceptions 🧨

If a request to the Needle API fails, a `needle.v1.models.Error` object is raised. It contains a `message` and further details about the error.
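
A minimal sketch of catching such a failure (the collection id below is made up purely to trigger an error):

```python
from needle.v1 import NeedleClient
from needle.v1.models import Error

ndl = NeedleClient()
try:
    # a request expected to fail: this collection id does not exist
    ndl.collections.files.list("non-existent-collection-id")
except Error as e:
    # the error carries a message and further details
    print(e.message)
```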

## Support 📞

If you have questions, you can contact us in our [Discord channel](https://discord.gg/JzJcHgTyZx).

## License

`needle-python` is distributed under the terms of the MIT license.

            
