# llama-index-tools-vectara-query

- **Version:** 0.2.0
- **Summary:** llama-index tools vectara query integration
- **Author:** David Oplatka <david.oplatka@vectara.com>
- **License:** MIT
- **Requires Python:** <4.0,>=3.9
- **Upload time:** 2024-11-18 00:57:32
- **Requirements:** none recorded

## Vectara Query Tool

This tool connects to a Vectara corpus and lets agents run semantic search or retrieval-augmented generation (RAG) queries.

## Usage

A more extensive usage example is documented in a Jupyter notebook [here](https://github.com/run-llama/llama_index/blob/main/llama-index-integrations/tools/llama-index-tools-vectara-query/examples/vectara_query.ipynb).

To use this tool, you'll need the following environment variables set (a sketch of setting them from Python follows this list):

- `VECTARA_CUSTOMER_ID`: The customer ID for your Vectara account. If you don't have an account, you can create one [here](https://vectara.com/integrations/llamaindex).
- `VECTARA_CORPUS_ID`: The ID of the Vectara corpus that you want the tool to search. If you need help creating a corpus with your data, follow the [Quick Start](https://docs.vectara.com/docs/quickstart) guide.
- `VECTARA_API_KEY`: An API key that can perform queries on this corpus.
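
If you prefer to set these values from Python rather than your shell, a minimal sketch looks like the following (the placeholder values are hypothetical; substitute your own credentials):

```python
import os

# Hypothetical placeholder values -- replace with your own Vectara credentials.
os.environ["VECTARA_CUSTOMER_ID"] = "your-customer-id"
os.environ["VECTARA_CORPUS_ID"] = "your-corpus-id"
os.environ["VECTARA_API_KEY"] = "your-query-api-key"
```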

Here's an example of using the `VectaraQueryToolSpec` with an OpenAI agent:

```python
from llama_index.tools.vectara_query import VectaraQueryToolSpec
from llama_index.agent.openai import OpenAIAgent

# Connect to a Vectara corpus about electric vehicles.
# Credentials are read from the VECTARA_* environment variables described above.
tool_spec = VectaraQueryToolSpec()

# Expose the spec's tools to an OpenAI-powered agent.
agent = OpenAIAgent.from_tools(tool_spec.to_tool_list())

response = agent.chat("What are the different types of electric vehicles?")
print(response)
```
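
To see exactly which tools the agent receives, you can inspect the tool list; this short sketch simply prints the standard LlamaIndex tool metadata:

```python
# Each entry returned by to_tool_list() is a LlamaIndex FunctionTool
# carrying a name and description in its metadata.
for tool in tool_spec.to_tool_list():
    print(tool.metadata.name, "-", tool.metadata.description)
```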

The available tools are listed below; a sketch of calling them directly on the tool spec follows the list.

`semantic_search`: A tool that accepts a query and uses semantic search to obtain the top search results.

`rag_query`: A tool that accepts a query and uses RAG to obtain a generative response grounded in the search results.
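
Although an agent will normally choose and invoke these tools for you, you can also call them directly on the tool spec. A minimal sketch, assuming each method accepts a plain query string:

```python
from llama_index.tools.vectara_query import VectaraQueryToolSpec

tool_spec = VectaraQueryToolSpec()

# Top search results for the query (assumed to take a single query string).
results = tool_spec.semantic_search("What are the different types of electric vehicles?")
print(results)

# Generated answer grounded in the retrieved search results.
answer = tool_spec.rag_query("What are the different types of electric vehicles?")
print(answer)
```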

            

## Release files

| File | Type | Size | Upload time (UTC) | SHA256 |
| --- | --- | --- | --- | --- |
| `llama_index_tools_vectara_query-0.2.0-py3-none-any.whl` | bdist_wheel (py3) | 4,482 bytes | 2024-11-18 00:57:31 | `a930c4e158e42e1b1137af71f391495046893e0464e0242f4872b68be4e9ce89` |
| `llama_index_tools_vectara_query-0.2.0.tar.gz` | sdist | 4,145 bytes | 2024-11-18 00:57:32 | `7591d4c1065202f0b1a636c9dbb079d2b95593c76b795be32a7f236310058cdf` |

Both files are hosted on files.pythonhosted.org:

- https://files.pythonhosted.org/packages/e7/c3/05ce3759bbc4e970536ae148d977bcbaed38985312ef981e12685e800a11/llama_index_tools_vectara_query-0.2.0-py3-none-any.whl
- https://files.pythonhosted.org/packages/e2/4a/20098d87c1d177e568a3ce2d707f57317b3538927cb25a4963b67d33c7d3/llama_index_tools_vectara_query-0.2.0.tar.gz
        