# Wirtual Plugins Llama Index
Agent Framework plugin for using Llama Index. Currently supports [Query Engine](https://docs.llamaindex.ai/en/stable/module_guides/deploying/query_engine/) and [Chat Engine](https://docs.llamaindex.ai/en/stable/module_guides/deploying/chat_engines/).
## Install
```bash
pip install wirtual-plugins-llama-index
```
## Query Engine
The Query Engine is primarily used for RAG. See the [example voice agent](https://github.com/wirtualdev/wirtual-agents/blob/main/examples/voice-pipeline-agent/llamaindex-rag/query_engine.py).
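For background, a query engine couples retrieval with generation: given a question, it first fetches the most relevant documents, then answers from them. A minimal, dependency-free sketch of the retrieval half (toy word-overlap scoring; all names here are illustrative, not the plugin's API):

```python
from collections import Counter

def score(query: str, doc: str) -> int:
    """Count overlapping words between query and document (toy relevance score)."""
    q, d = Counter(query.lower().split()), Counter(doc.lower().split())
    return sum(min(q[w], d[w]) for w in q)

def retrieve(query: str, docs: list[str], k: int = 1) -> list[str]:
    """Return the top-k documents by overlap score."""
    return sorted(docs, key=lambda d: score(query, d), reverse=True)[:k]

docs = [
    "Wirtual agents stream audio over WebRTC",
    "LlamaIndex builds indexes over your documents",
    "Silero provides voice activity detection",
]
print(retrieve("how does llamaindex index documents", docs))
# → ['LlamaIndex builds indexes over your documents']
```

A real query engine replaces the toy scorer with embedding similarity and passes the retrieved text to an LLM to compose the answer.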
## Chat Engine
The Chat Engine can be used as an LLM within the framework.
```python
# NOTE: import paths are indicative; see the linked full example for exact paths
from llama_index.core import StorageContext, load_index_from_storage
from llama_index.core.chat_engine.types import ChatMode

from wirtual.agents import JobContext
from wirtual.agents.pipeline import VoicePipelineAgent
from wirtual.plugins import deepgram, llama_index, openai, silero

# load the existing index from disk
storage_context = StorageContext.from_defaults(persist_dir="<mydir>")
index = load_index_from_storage(storage_context)

async def entrypoint(ctx: JobContext):
    ...
    # wrap the chat engine so it can serve as the agent's LLM
    chat_engine = index.as_chat_engine(chat_mode=ChatMode.CONTEXT)
    assistant = VoicePipelineAgent(
        vad=silero.VAD.load(),
        stt=deepgram.STT(),
        llm=llama_index.LLM(chat_engine=chat_engine),
        tts=openai.TTS(),
        chat_ctx=initial_ctx,
    )
```
Full example [here](https://github.com/wirtualdev/wirtual-agents/blob/main/examples/voice-pipeline-agent/llamaindex-rag/chat_engine.py).
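Conceptually, `llama_index.LLM(chat_engine=...)` is an adapter: it exposes a chat engine through the framework's generic LLM interface so the voice pipeline doesn't need to know what backs it. A dependency-free sketch of that pattern (all names below are hypothetical, not the plugin's actual classes):

```python
from dataclasses import dataclass
from typing import Protocol

class ChatEngine(Protocol):
    """Anything that can answer a chat message."""
    def chat(self, message: str) -> str: ...

@dataclass
class EchoChatEngine:
    """Stand-in chat engine used only for illustration."""
    prefix: str = "context-aware answer: "

    def chat(self, message: str) -> str:
        return self.prefix + message

class LLMAdapter:
    """Adapter exposing a chat engine through an LLM-style `complete` call."""
    def __init__(self, chat_engine: ChatEngine):
        self.chat_engine = chat_engine

    def complete(self, prompt: str) -> str:
        # delegate the prompt to the wrapped chat engine
        return self.chat_engine.chat(prompt)

adapter = LLMAdapter(EchoChatEngine())
print(adapter.complete("hi"))  # → context-aware answer: hi
```

The design choice is the same as the plugin's: the pipeline holds one `llm=` slot, and the adapter lets a retrieval-backed chat engine fill it interchangeably with a plain LLM.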