llama-index-packs-mixture-of-agents


Name: llama-index-packs-mixture-of-agents
Version: 0.3.0
home_page: None
Summary: llama-index packs mixture_of_agents paper implementation
upload_time: 2024-11-17 22:42:32
maintainer: ravi03071991
docs_url: None
author: Ravi Theja
requires_python: <4.0,>=3.9
license: MIT
keywords: agents, llms, query, togetherai
requirements: No requirements were recorded.
# Mixture-Of-Agents Pack

Implementation of the [Mixture-Of-Agents](https://arxiv.org/abs/2406.04692) paper from TogetherAI as a LlamaPack.

Disclaimer: While the paper names the method "Mixture of Agents", the "agents" appear to refer to the LLMs themselves, not to actual agentic behaviour.

## Approach

The capabilities of LLMs have advanced significantly, and there is now a growing number of these models available. To maximize their potential, we need to harness the collective expertise of multiple LLMs. This is where the Mixture-of-Agents (MoA) approach comes in.

The MoA approach is a layered architecture in which each layer consists of multiple LLM "agents". These agents collaborate by taking the outputs of the agents in the previous layer as auxiliary information when generating their own responses, so that answers are refined and enhanced as the agents build on each other's strengths. The process involves two roles: Proposers (the reference LLMs), which generate diverse context and perspectives, and an Aggregator (the main LLM), which synthesizes these proposals into a single, high-quality output. By iteratively refining the responses over additional layers, the MoA approach aims to maximize the collaborative potential of multiple LLMs, leading to superior outcomes.
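
The sketch below is a minimal, framework-free illustration of this layered flow; it is not the pack's implementation, and the `LLM` callables and prompt formats are assumptions made purely for demonstration.

```python
from typing import Callable, List

LLM = Callable[[str], str]  # any function mapping a prompt to a completion


def mixture_of_agents(
    query: str,
    proposers: List[LLM],
    aggregator: LLM,
    num_layers: int = 3,
) -> str:
    """Conceptual MoA loop: each layer's proposers see the previous layer's
    answers as auxiliary context; the aggregator produces the final response."""
    previous_answers: List[str] = []
    for _ in range(num_layers):
        context = "\n".join(previous_answers)
        prompt = f"{query}\n\nPrevious responses:\n{context}" if context else query
        # Every proposer answers, conditioned on the prior layer's outputs.
        previous_answers = [propose(prompt) for propose in proposers]
    # The aggregator synthesizes the last layer's proposals into one answer.
    final_prompt = f"{query}\n\nCandidate responses:\n" + "\n".join(previous_answers)
    return aggregator(final_prompt)


# Toy usage with stand-in "LLMs" (plain functions) to show the data flow.
echo_a: LLM = lambda p: "answer A based on: " + p.splitlines()[0]
echo_b: LLM = lambda p: "answer B based on: " + p.splitlines()[0]
print(mixture_of_agents("What is LlamaIndex?", [echo_a, echo_b], echo_a, num_layers=2))
```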

## CLI Usage

You can download LlamaPacks directly using `llamaindex-cli`, which is installed with the `llama-index` Python package:

```bash
llamaindex-cli download-llamapack MixtureOfAgentsPack --download-dir ./mixture_of_agents_pack
```

You can then inspect the files at `./mixture_of_agents_pack` and use them as a template for your own project.

## Code Usage

You can use the LlamaPack in either of two ways:

1. Install the LlamaPack.
2. Download the LlamaPack.

### 1. Install the LlamaPack:

```bash
pip install llama-index-packs-mixture-of-agents
```

### 2. Download the LlamaPack:

You can download the pack to the `./mixture_of_agents_pack` directory:

```python
from llama_index.core.llama_pack import download_llama_pack

# download and install dependencies
MixtureOfAgentsPack = download_llama_pack(
    "MixtureOfAgentsPack", "./mixture_of_agents_pack"
)
```

Once installed or downloaded, you can use the LlamaPack as follows:

```python
# Necessary for async operations in Jupyter notebooks
import nest_asyncio

nest_asyncio.apply()

from llama_index.llms.openai import OpenAI
from llama_index.llms.mistralai import MistralAI

# Set OPENAI_API_KEY and MISTRAL_API_KEY as environment variables

mixture_of_agents_pack = MixtureOfAgentsPack(
    llm=OpenAI(model="gpt-4"),  # Aggregator
    reference_llms=[
        OpenAI(model="gpt-3.5-turbo"),
        MistralAI(model="mistral-medium"),
    ],  # Proposers
    num_layers=3,
    temperature=0.1,
    timeout=200,  # timeout (in seconds) for a response from the workflow
)
```
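
If you prefer to set the API keys from code (for example, at the top of a notebook), you can export them via `os.environ` before constructing the pack. The key values below are placeholders, not real credentials.

```python
import os

# Placeholder keys; replace with your own before running.
os.environ["OPENAI_API_KEY"] = "sk-..."
os.environ["MISTRAL_API_KEY"] = "your-mistral-key"
```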

From here, you can use the pack, or inspect and modify the pack in `./mixture_of_agents_pack`.

The `run()` function is a light wrapper around the proposed approach in the paper.

```python
response = mixture_of_agents_pack.run("What is LlamaIndex?")
```

            
