llama-index-tools-vectara-query


Name: llama-index-tools-vectara-query
Version: 0.4.1
Summary: llama-index tools vectara query integration
Author email: David Oplatka <david.oplatka@vectara.com>
Requires Python: <4.0,>=3.9
Upload time: 2025-09-08 20:49:24
## Vectara Query Tool

This tool connects to a Vectara corpus and allows agents to make semantic search or retrieval-augmented generation (RAG) queries.

## Usage

Please note that this usage example requires version >=0.3.0 of this package.

A more extensive usage example is documented in a Jupyter notebook [here](https://github.com/run-llama/llama_index/blob/main/llama-index-integrations/tools/llama-index-tools-vectara-query/examples/vectara_query.ipynb).

To use this tool, you'll need a Vectara account (if you don't have one, you can create it [here](https://vectara.com/integrations/llamaindex)) and the following information in your environment (a setup sketch follows the list):

- `VECTARA_CORPUS_KEY`: The corpus key of the Vectara corpus that you want your tool to search. If you need help creating a corpus with your data, follow this [Quick Start](https://docs.vectara.com/docs/quickstart) guide.
- `VECTARA_API_KEY`: An API key that can perform queries on this corpus.
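
These values can be exported in your shell or set in Python before constructing the tool spec. A minimal sketch, using placeholder credentials:

```python
import os

# Placeholder values -- replace with your own corpus key and API key.
os.environ["VECTARA_CORPUS_KEY"] = "my-corpus-key"
os.environ["VECTARA_API_KEY"] = "my-api-key"
```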

Here's an example usage of `VectaraQueryToolSpec`:

```python
from llama_index.tools.vectara_query import VectaraQueryToolSpec
from llama_index.core.agent.workflow import FunctionAgent
from llama_index.llms.openai import OpenAI

# Connecting to a Vectara corpus about Electric Vehicles
tool_spec = VectaraQueryToolSpec()

agent = FunctionAgent(
    tools=tool_spec.to_tool_list(),
    llm=OpenAI(model="gpt-4.1"),
)

print(await agent.run("What are the different types of electric vehicles?"))
```
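
`agent.run` is a coroutine, so the `await` above assumes an async context such as a Jupyter notebook. In a plain Python script, you can wrap the call with `asyncio.run`; a minimal sketch reusing the `agent` defined above:

```python
import asyncio


async def main():
    # Run the agent query inside an event loop.
    response = await agent.run("What are the different types of electric vehicles?")
    print(response)


asyncio.run(main())
```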

The available tools are:

`semantic_search`: A tool that accepts a query and uses semantic search to obtain the top search results.

`rag_query`: A tool that accepts a query and uses RAG to obtain a generative response grounded in the search results.
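
You can also call the spec methods directly to inspect results without an agent. A minimal sketch, assuming the methods share the tool names listed above and accept a query string:

```python
# Top matching snippets from semantic search.
results = tool_spec.semantic_search("What are the different types of electric vehicles?")

# Generated answer grounded in the retrieved results.
answer = tool_spec.rag_query("What are the different types of electric vehicles?")
print(answer)
```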

            
