# LiveKit Plugins Llama Index
Agent Framework plugin for using Llama Index. Currently supports [Query Engine](https://docs.llamaindex.ai/en/stable/module_guides/deploying/query_engine/) and [Chat Engine](https://docs.llamaindex.ai/en/stable/module_guides/deploying/chat_engines/).
## Install
```bash
pip install livekit-plugins-llama-index
```
## Query Engine
Query Engine is primarily used for RAG. See the [example voice agent](https://github.com/livekit/agents/blob/main/examples/voice-pipeline-agent/llamaindex-rag/query_engine.py).
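The linked example wires a query engine into a voice agent so answers are grounded in your own documents. As a rough sketch of the LlamaIndex side only (the `data` directory and the question below are illustrative placeholders, not taken from the example):

```python
from llama_index.core import SimpleDirectoryReader, VectorStoreIndex

# Build an index over local documents (assumes a ./data directory of source files).
documents = SimpleDirectoryReader("data").load_data()
index = VectorStoreIndex.from_documents(documents)

# Ask the query engine a question; inside a voice agent, the response text
# would be spoken back to the user via TTS.
query_engine = index.as_query_engine()
response = query_engine.query("What does the knowledge base cover?")
print(response)
```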
## Chat Engine
Chat Engine can be used as an LLM within the framework.
```python
from livekit.agents import JobContext
from livekit.agents.pipeline import VoicePipelineAgent
from livekit.plugins import deepgram, llama_index, openai, silero
from llama_index.core import StorageContext, load_index_from_storage
from llama_index.core.chat_engine.types import ChatMode

# load the existing index from disk
storage_context = StorageContext.from_defaults(persist_dir=<mydir>)
index = load_index_from_storage(storage_context)

async def entrypoint(ctx: JobContext):
    ...
    # wrap the chat engine so it can serve as the agent's LLM
    chat_engine = index.as_chat_engine(chat_mode=ChatMode.CONTEXT)
    assistant = VoicePipelineAgent(
        vad=silero.VAD.load(),
        stt=deepgram.STT(),
        llm=llama_index.LLM(chat_engine=chat_engine),
        tts=openai.TTS(),
        chat_ctx=initial_ctx,  # an initial chat context with your system prompt
    )
```
The full example is available [here](https://github.com/livekit/agents/blob/main/examples/voice-pipeline-agent/llamaindex-rag/chat_engine.py).
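The agent above still needs to be launched by a worker. A minimal sketch of doing so with the Agents CLI is shown below; it assumes the standard `cli.run_app` / `WorkerOptions` entry point from `livekit.agents` and that `entrypoint` is the coroutine defined in the snippet above.

```python
from livekit.agents import WorkerOptions, cli

# Assumes `entrypoint` is the coroutine defined in the snippet above.
if __name__ == "__main__":
    cli.run_app(WorkerOptions(entrypoint_fnc=entrypoint))
```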