# Ollama Query Engine Pack
Create a completely local query engine powered by Ollama.
## CLI Usage
You can download llamapacks directly using `llamaindex-cli`, which comes installed with the `llama-index` Python package:
```bash
llamaindex-cli download-llamapack OllamaQueryEnginePack --download-dir ./ollama_pack
```
You can then inspect the files at `./ollama_pack` and use them as a template for your own project.
## Code Usage
You can download the pack to the `./ollama_pack` directory:
```python
from llama_index.core.llama_pack import download_llama_pack
# download and install dependencies
OllamaQueryEnginePack = download_llama_pack(
    "OllamaQueryEnginePack", "./ollama_pack"
)
# You can use any llama-hub loader to get documents!
ollama_pack = OllamaQueryEnginePack(model="llama2", documents=documents)
```
From here, you can use the pack, or inspect and modify the pack in `./ollama_pack`.
The `run()` function is a light wrapper around `index.as_query_engine().query()`.
```python
response = ollama_pack.run("What is the title of the book of John?")
```
You can also use modules individually.
```python
# Use the llm
llm = ollama_pack.llm
response = llm.complete("What is Ollama?")
# Use the index directly
index = ollama_pack.index
query_engine = index.as_query_engine()
retriever = index.as_retriever()
```