llama-index-llms-mistralai


Namellama-index-llms-mistralai JSON
Version 0.2.7 PyPI version JSON
download
home_pageNone
Summaryllama-index llms mistral ai integration
upload_time2024-10-16 16:28:03
maintainerNone
docs_urlNone
authorYour Name
requires_python<4.0,>=3.9
licenseMIT
keywords
VCS
bugtrack_url
requirements No requirements were recorded.
Travis-CI No Travis.
coveralls test coverage No coveralls.
# LlamaIndex Llms Integration: Mistral

## Installation

Install the required packages with the following commands (the `%` and `!` prefixes are Jupyter notebook syntax; omit them in a regular shell):

```bash
%pip install llama-index-llms-mistralai
!pip install llama-index
```

## Basic Usage

### Initialize the MistralAI Model

To use the MistralAI model, create an instance and provide your API key:

```python
from llama_index.llms.mistralai import MistralAI

llm = MistralAI(api_key="<replace-with-your-key>")
```
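Rather than hardcoding the key, you can read it from the environment. A minimal sketch, assuming the key is exported as `MISTRAL_API_KEY` (the variable name is a convention here, not something the constructor requires):

```python
import os

# Assumption: you exported MISTRAL_API_KEY in your shell beforehand.
# setdefault only fills in the placeholder if the variable is missing.
os.environ.setdefault("MISTRAL_API_KEY", "<replace-with-your-key>")
api_key = os.environ["MISTRAL_API_KEY"]

# llm = MistralAI(api_key=api_key)  # then pass it to the constructor as above
```

This keeps the secret out of source control while leaving the rest of the examples unchanged.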

### Generate Completions

To generate a text completion for a prompt, use the `complete` method:

```python
resp = llm.complete("Paul Graham is ")
print(resp)
```

### Chat with the Model

You can also chat with the model using a list of messages. Here’s an example:

```python
from llama_index.core.llms import ChatMessage

messages = [
    ChatMessage(role="system", content="You are CEO of MistralAI."),
    ChatMessage(role="user", content="Tell me the story about La plateforme"),
]
resp = llm.chat(messages)
print(resp)
```

### Using Random Seed

To set a random seed for reproducibility, initialize the model with the `random_seed` parameter:

```python
resp = MistralAI(random_seed=42).chat(messages)
print(resp)
```

## Streaming Responses

### Stream Completions

You can stream responses using the `stream_complete` method:

```python
resp = llm.stream_complete("Paul Graham is ")
for r in resp:
    print(r.delta, end="")
```

### Stream Chat Responses

To stream chat messages, use the following code:

```python
messages = [
    ChatMessage(role="system", content="You are CEO of MistralAI."),
    ChatMessage(role="user", content="Tell me the story about La plateforme"),
]
resp = llm.stream_chat(messages)
for r in resp:
    print(r.delta, end="")
```
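The streaming pattern above simply concatenates `delta` chunks. The following standalone sketch, with a fake generator standing in for `stream_complete`/`stream_chat`, shows how the pieces reassemble into the full response:

```python
from dataclasses import dataclass


@dataclass
class Chunk:
    delta: str  # stands in for the delta attribute on each streamed event


def fake_stream():
    # Stand-in for the generator returned by stream_complete / stream_chat;
    # the real one yields chunks as the model produces them.
    for piece in ["Paul ", "Graham ", "is ", "a programmer."]:
        yield Chunk(delta=piece)


text = ""
for r in fake_stream():
    text += r.delta  # same pattern as print(r.delta, end="")

print(text)  # Paul Graham is a programmer.
```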

## Configure Model

To use a specific model configuration, initialize the model with the desired model name:

```python
llm = MistralAI(model="mistral-medium")
resp = llm.stream_complete("Paul Graham is ")
for r in resp:
    print(r.delta, end="")
```

## Function Calling

The model can call functions that you expose as tools. Here’s an example:

```python
from llama_index.llms.mistralai import MistralAI
from llama_index.core.tools import FunctionTool


def multiply(a: int, b: int) -> int:
    """Multiply two integers and return the result."""
    return a * b


def mystery(a: int, b: int) -> int:
    """Mystery function on two integers."""
    return a * b + a + b


mystery_tool = FunctionTool.from_defaults(fn=mystery)
multiply_tool = FunctionTool.from_defaults(fn=multiply)

llm = MistralAI(model="mistral-large-latest")
response = llm.predict_and_call(
    [mystery_tool, multiply_tool],
    user_msg="What happens if I run the mystery function on 5 and 7",
)
print(str(response))
```
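Because the tools are plain Python functions, you can sanity-check the answer the model should arrive at before invoking `predict_and_call`:

```python
# The tool body is ordinary Python, so the expected result for the prompt
# above ("mystery function on 5 and 7") can be verified directly.
def mystery(a: int, b: int) -> int:
    """Mystery function on two integers."""
    return a * b + a + b


print(mystery(5, 7))  # 47  (5*7 + 5 + 7)
```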

### LLM Implementation Example

https://docs.llamaindex.ai/en/stable/examples/llm/mistralai/

            
