llama-index-packs-cohere-citation-chat

Name: llama-index-packs-cohere-citation-chat
Version: 0.5.0
Summary: llama-index packs cohere_citation_chat integration
Upload time: 2025-07-31 02:51:33
Maintainer: EugeneLightsOn
Requires-Python: <4.0,>=3.9
Keywords: chat, citation, cite, cohere, engine, index
# Cohere Citations Chat Engine Pack

Creates and runs a custom `VectorStoreIndexWithCitationsChat`, a chat engine that returns the source documents and citations alongside each response.
See the Cohere RAG documentation [here](https://docs.cohere.com/docs/retrieval-augmented-generation-rag).

## CLI Usage

You can download llamapacks directly using `llamaindex-cli`, which comes installed with the `llama-index` Python package:

```bash
llamaindex-cli download-llamapack CohereCitationChatEnginePack --download-dir ./cohere_citation_chat_pack
```

You can then inspect the files at `./cohere_citation_chat_pack` and use them as a template for your own project!

You can also install the pack directly if you don't want to inspect the source code:

```bash
pip install llama-index-packs-cohere-citation-chat
```

## Code Usage

You can download the pack to the `./cohere_citation_chat_pack` directory:

```python
from llama_index.readers.web import SimpleWebPageReader
from llama_index.core.llama_pack import download_llama_pack

# download and install dependencies
CohereCitationChatEnginePack = download_llama_pack(
    "CohereCitationChatEnginePack", "./cohere_citation_chat_pack"
)
documents = SimpleWebPageReader().load_data(
    [
        "https://raw.githubusercontent.com/jerryjliu/llama_index/adb054429f642cc7bbfcb66d4c232e072325eeab/examples/paul_graham_essay/data/paul_graham_essay.txt"
    ]
)
cohere_citation_chat_pack = CohereCitationChatEnginePack(
    documents=documents, cohere_api_key="your-api-key"
)
chat_engine = cohere_citation_chat_pack.run()
response = chat_engine.chat("What can you tell me about LLMs?")

# print chat response
print(response)

# print documents
print(response.documents)

# print citations
print(response.citations)
```
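Each citation returned in Cohere's citation mode marks a character span of the generated text together with the ids of the documents that support it. The exact attribute names on the pack's response object are best checked in the notebook below; the sketch here uses plain dicts shaped like Cohere's documented citation schema (`start`, `end`, `document_ids` are taken from that schema, the sample text and ids are made up) to show how spans map back onto the response text:

```python
# Hypothetical citation payloads shaped like Cohere's citation schema:
# each cites a [start, end) span of the response text plus source doc ids.
response_text = "LLMs are large language models trained on text."
citations = [
    {"start": 0, "end": 4, "document_ids": ["doc_0"]},
    {"start": 9, "end": 30, "document_ids": ["doc_0", "doc_1"]},
]


def cited_snippets(text, citations):
    """Map each citation span back to the substring it supports."""
    return [
        (text[c["start"] : c["end"]], c["document_ids"]) for c in citations
    ]


for snippet, doc_ids in cited_snippets(response_text, citations):
    print(f"{snippet!r} <- {doc_ids}")
```

This is only a schema illustration; with the pack itself you would iterate over `response.citations` directly.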

See the [example notebook](https://github.com/run-llama/llama_index/blob/main/llama-index-packs/llama-index-packs-cohere-citation-chat/examples/cohere_citation_chat_example.ipynb) for a full walkthrough.
