# Cohere Citations Chat Engine Pack
Creates and runs a custom `VectorStoreIndexWithCitationsChat`, which provides a chat engine with documents and citations mode.
See the Cohere RAG documentation [here](https://docs.cohere.com/docs/retrieval-augmented-generation-rag).
## CLI Usage
You can download llamapacks directly using `llamaindex-cli`, which comes installed with the `llama-index` Python package:
```bash
llamaindex-cli download-llamapack CohereCitationChatEnginePack --download-dir ./cohere_citation_chat_pack
```
You can then inspect the files at `./cohere_citation_chat_pack` and use them as a template for your own project!
You can also directly install it if you don't want to look at/inspect the source code:
```bash
pip install llama-index-packs-cohere-citation-chat
```
## Code Usage
You can download the pack to the `./cohere_citation_chat_pack` directory:
```python
from llama_index.readers.web import SimpleWebPageReader
from llama_index.core.llama_pack import download_llama_pack

# download and install dependencies
CohereCitationChatEnginePack = download_llama_pack(
    "CohereCitationChatEnginePack", "./cohere_citation_chat_pack"
)

documents = SimpleWebPageReader().load_data(
    [
        "https://raw.githubusercontent.com/jerryjliu/llama_index/adb054429f642cc7bbfcb66d4c232e072325eeab/examples/paul_graham_essay/data/paul_graham_essay.txt"
    ]
)

cohere_citation_chat_pack = CohereCitationChatEnginePack(
    documents=documents, cohere_api_key="your-api-key"
)
chat_engine = cohere_citation_chat_pack.run()
response = chat_engine.chat("What can you tell me about LLMs?")

# print chat response
print(response)

# print documents
print(response.documents)

# print citations
print(response.citations)
```
See the [example notebook](https://github.com/run-llama/llama_index/blob/main/llama-index-packs/llama-index-packs-cohere-citation-chat/examples/cohere_citation_chat_example.ipynb) for a full walkthrough.
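Cohere-style citations carry character spans that point back into the response text. As a rough sketch of how you might render them inline (this assumes citation records with `start`, `end`, and `document_ids` fields; the pack's actual response objects may expose these as attributes rather than dict keys):

```python
def annotate(text, citations):
    """Insert [doc-id, ...] markers after each cited span of `text`.

    Assumes non-overlapping, Cohere-style character spans with
    `start`, `end`, and `document_ids` keys (hypothetical shape).
    """
    out = []
    pos = 0
    for c in sorted(citations, key=lambda c: c["start"]):
        out.append(text[pos : c["end"]])                    # text up to span end
        out.append("[" + ",".join(c["document_ids"]) + "]")  # citation marker
        pos = c["end"]
    out.append(text[pos:])  # remaining uncited text
    return "".join(out)


text = "LLMs are large language models."
citations = [{"start": 0, "end": 4, "document_ids": ["doc-1"]}]
print(annotate(text, citations))
# LLMs[doc-1] are large language models.
```

This is only a rendering helper for illustration; adapt the field access to whatever shape `response.citations` actually returns in your installed version.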