| Field | Value |
| --- | --- |
| Name | langchain-mistralai |
| Version | 0.2.12 |
| home_page | None |
| Summary | An integration package connecting Mistral and LangChain |
| upload_time | 2025-09-18 15:47:40 |
| maintainer | None |
| docs_url | None |
| author | None |
| requires_python | >=3.9 |
| license | MIT |
| keywords | |
| VCS | |
| bugtrack_url | |
| requirements | No requirements were recorded. |
| Travis-CI | No Travis. |
| coveralls test coverage | No coveralls. |

# langchain-mistralai
This package contains the LangChain integrations for [MistralAI](https://docs.mistral.ai) through their [mistralai](https://pypi.org/project/mistralai/) SDK.
## Installation
```bash
pip install -U langchain-mistralai
```
## Chat Models
This package contains the `ChatMistralAI` class, which is the recommended way to interface with MistralAI models.
To use it, install the requirements and configure your environment:
```bash
export MISTRAL_API_KEY=your-api-key
```
Then initialize the chat model:
```python
from langchain_core.messages import HumanMessage
from langchain_mistralai.chat_models import ChatMistralAI
chat = ChatMistralAI(model="mistral-small")
messages = [HumanMessage(content="say a brief hello")]
chat.invoke(messages)
```
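If you prefer not to rely on the `MISTRAL_API_KEY` environment variable, the key can typically be passed to the constructor instead. A minimal sketch, assuming the constructor accepts an `api_key` parameter (check the class signature of your installed version):

```python
from langchain_mistralai.chat_models import ChatMistralAI

# Pass the key explicitly instead of reading MISTRAL_API_KEY from the
# environment (parameter name assumed; verify against your installed version).
chat = ChatMistralAI(model="mistral-small", api_key="your-api-key")
```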
`ChatMistralAI` also supports async and streaming functionality:
```python
# For async...
await chat.ainvoke(messages)
# For streaming...
for chunk in chat.stream(messages):
    print(chunk.content, end="", flush=True)
```
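To run the async calls in a standalone script, wrap them in an event loop. A minimal sketch using `asyncio`, assuming the `astream` method from LangChain's standard Runnable interface is available on `ChatMistralAI`:

```python
import asyncio

from langchain_core.messages import HumanMessage
from langchain_mistralai.chat_models import ChatMistralAI


async def main() -> None:
    chat = ChatMistralAI(model="mistral-small")
    messages = [HumanMessage(content="say a brief hello")]

    # Single async call
    response = await chat.ainvoke(messages)
    print(response.content)

    # Async streaming: print chunks as they arrive
    async for chunk in chat.astream(messages):
        print(chunk.content, end="", flush=True)


asyncio.run(main())
```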
## Embeddings
With `MistralAIEmbeddings`, you can use the default `mistral-embed` model directly, or set a different one if available.
### Choose model
`embedding.model = 'mistral-embed'`
### Simple query
`res_query = embedding.embed_query("The test information")`
### Documents
`res_document = embedding.embed_documents(["test1", "another test"])`
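
Putting the snippets above together, a minimal end-to-end sketch (assuming `MISTRAL_API_KEY` is set in the environment and that `MistralAIEmbeddings` is importable from `langchain_mistralai.embeddings`, mirroring the chat model import):

```python
from langchain_mistralai.embeddings import MistralAIEmbeddings

# Assumes MISTRAL_API_KEY is set in the environment
embedding = MistralAIEmbeddings()

# Choose the model (the default is already 'mistral-embed')
embedding.model = "mistral-embed"

# Embed a single query string -> one vector (a list of floats)
res_query = embedding.embed_query("The test information")

# Embed multiple documents -> one vector per input string
res_document = embedding.embed_documents(["test1", "another test"])

print(len(res_query), len(res_document))  # vector dimension, number of documents
```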
Raw data
{
"_id": null,
"home_page": null,
"name": "langchain-mistralai",
"maintainer": null,
"docs_url": null,
"requires_python": ">=3.9",
"maintainer_email": null,
"keywords": null,
"author": null,
"author_email": null,
"download_url": "https://files.pythonhosted.org/packages/3d/b9/c6ee8f2383a63806d55e9426f02d26399dee3acff45c7e6ee04a156542a1/langchain_mistralai-0.2.12.tar.gz",
"platform": null,
"description": "# langchain-mistralai\n\nThis package contains the LangChain integrations for [MistralAI](https://docs.mistral.ai) through their [mistralai](https://pypi.org/project/mistralai/) SDK.\n\n## Installation\n\n```bash\npip install -U langchain-mistralai\n```\n\n## Chat Models\n\nThis package contains the `ChatMistralAI` class, which is the recommended way to interface with MistralAI models.\n\nTo use, install the requirements, and configure your environment.\n\n```bash\nexport MISTRAL_API_KEY=your-api-key\n```\n\nThen initialize\n\n```python\nfrom langchain_core.messages import HumanMessage\nfrom langchain_mistralai.chat_models import ChatMistralAI\n\nchat = ChatMistralAI(model=\"mistral-small\")\nmessages = [HumanMessage(content=\"say a brief hello\")]\nchat.invoke(messages)\n```\n\n`ChatMistralAI` also supports async and streaming functionality:\n\n```python\n# For async...\nawait chat.ainvoke(messages)\n\n# For streaming...\nfor chunk in chat.stream(messages):\n print(chunk.content, end=\"\", flush=True)\n```\n\n## Embeddings\n\nWith `MistralAIEmbeddings`, you can directly use the default model 'mistral-embed', or set a different one if available.\n\n### Choose model\n\n`embedding.model = 'mistral-embed'`\n\n### Simple query\n\n`res_query = embedding.embed_query(\"The test information\")`\n\n### Documents\n\n`res_document = embedding.embed_documents([\"test1\", \"another test\"])`\n",
"bugtrack_url": null,
"license": "MIT",
"summary": "An integration package connecting Mistral and LangChain",
"version": "0.2.12",
"project_urls": {
"Release Notes": "https://github.com/langchain-ai/langchain/releases?q=tag%3A%22langchain-mistralai%3D%3D0%22&expanded=true",
"Source Code": "https://github.com/langchain-ai/langchain/tree/master/libs/partners/mistralai",
"repository": "https://github.com/langchain-ai/langchain"
},
"split_keywords": [],
"urls": [
{
"comment_text": null,
"digests": {
"blake2b_256": "64fea4bf7240beb12ebaf9f1780938ec4402b40e7fa5ffcedc7c25473c2078ed",
"md5": "55abc4449ccf2d9e196a01c04b252aa7",
"sha256": "64a85947776017eec787b586f4bfa092d237c5e95a9ed719b5ff22a81747dedf"
},
"downloads": -1,
"filename": "langchain_mistralai-0.2.12-py3-none-any.whl",
"has_sig": false,
"md5_digest": "55abc4449ccf2d9e196a01c04b252aa7",
"packagetype": "bdist_wheel",
"python_version": "py3",
"requires_python": ">=3.9",
"size": 16695,
"upload_time": "2025-09-18T15:47:39",
"upload_time_iso_8601": "2025-09-18T15:47:39.591119Z",
"url": "https://files.pythonhosted.org/packages/64/fe/a4bf7240beb12ebaf9f1780938ec4402b40e7fa5ffcedc7c25473c2078ed/langchain_mistralai-0.2.12-py3-none-any.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": null,
"digests": {
"blake2b_256": "3db9c6ee8f2383a63806d55e9426f02d26399dee3acff45c7e6ee04a156542a1",
"md5": "f9002e6e5a8dd6e1971ff7c263678973",
"sha256": "c2ecd1460c48adbe497a2d3794052dfcc974a1280ceab4476047e62343d8bbc9"
},
"downloads": -1,
"filename": "langchain_mistralai-0.2.12.tar.gz",
"has_sig": false,
"md5_digest": "f9002e6e5a8dd6e1971ff7c263678973",
"packagetype": "sdist",
"python_version": "source",
"requires_python": ">=3.9",
"size": 22176,
"upload_time": "2025-09-18T15:47:40",
"upload_time_iso_8601": "2025-09-18T15:47:40.498920Z",
"url": "https://files.pythonhosted.org/packages/3d/b9/c6ee8f2383a63806d55e9426f02d26399dee3acff45c7e6ee04a156542a1/langchain_mistralai-0.2.12.tar.gz",
"yanked": false,
"yanked_reason": null
}
],
"upload_time": "2025-09-18 15:47:40",
"github": true,
"gitlab": false,
"bitbucket": false,
"codeberg": false,
"github_user": "langchain-ai",
"github_project": "langchain",
"travis_ci": false,
"coveralls": false,
"github_actions": true,
"lcname": "langchain-mistralai"
}