# Ollama Query Engine Pack
Create a query engine that runs completely locally, powered by Ollama. (This assumes a local Ollama server is running and the model you want to use has already been pulled.)
## CLI Usage
You can download llamapacks directly using `llamaindex-cli`, which comes installed with the `llama-index` Python package:
```bash
llamaindex-cli download-llamapack OllamaQueryEnginePack --download-dir ./ollama_pack
```
You can then inspect the files at `./ollama_pack` and use them as a template for your own project.
## Code Usage
You can download the pack to the `./ollama_pack` directory:
```python
from llama_index.core import SimpleDirectoryReader
from llama_index.core.llama_pack import download_llama_pack

# download and install dependencies
OllamaQueryEnginePack = download_llama_pack(
    "OllamaQueryEnginePack", "./ollama_pack"
)

# You can use any llama-hub loader to get documents!
# (here, SimpleDirectoryReader loads files from an example ./data folder)
documents = SimpleDirectoryReader("./data").load_data()

ollama_pack = OllamaQueryEnginePack(model="llama2", documents=documents)
```
From here, you can use the pack, or inspect and modify the pack in `./ollama_pack`.
The `run()` function is a light wrapper around `index.as_query_engine().query()`.
```python
response = ollama_pack.run("What is the title of the book of John?")
```
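The return value is a standard LlamaIndex `Response` object, so, as a small follow-up sketch (reusing `ollama_pack` from above), you can print the synthesized answer and inspect the source nodes it was built from:

```python
# Print the synthesized answer text
print(str(response))

# Inspect which document chunks were retrieved to answer the query
for source_node in response.source_nodes:
    print(source_node.score, source_node.node.metadata)
```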
You can also use modules individually.
```python
# Use the llm
llm = ollama_pack.llm
response = llm.complete("What is Ollama?")
# Use the index directly
index = ollama_pack.index
query_engine = index.as_query_engine()
retriever = index.as_retriever()
```
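For example, here is a minimal sketch (reusing `query_engine` and `retriever` from the block above, with an example question) of querying the index directly and retrieving nodes without response synthesis:

```python
# Query the index directly, bypassing the pack's run() helper
response = query_engine.query("What is the title of the book of John?")
print(str(response))

# Retrieve the top-matching nodes only, without synthesizing an answer
nodes = retriever.retrieve("What is the title of the book of John?")
for node in nodes:
    print(node.score, node.get_content()[:100])
```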