| Field | Value |
| --- | --- |
| Name | langchain-ollama |
| Version | 0.3.6 |
| Summary | An integration package connecting Ollama and LangChain |
| Upload time | 2025-07-22 17:26:59 |
| Requires Python | >=3.9 |
| License | MIT |
# langchain-ollama
This package contains the LangChain integration with Ollama.
## Installation
```bash
pip install -U langchain-ollama
```
For the package to work, you will need to install and run the Ollama server locally ([download](https://ollama.com/download)).
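Before running any of the snippets below, it can help to confirm that the server is actually listening. A minimal stdlib sketch (assuming Ollama's default port, 11434):

```python
import urllib.request
import urllib.error

def ollama_running(base_url: str = "http://localhost:11434", timeout: float = 2.0) -> bool:
    """Return True if an HTTP server responds with 200 at base_url.

    A running Ollama server answers GET / with status 200.
    """
    try:
        with urllib.request.urlopen(base_url, timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False

# Example: warn early instead of failing mid-script.
# if not ollama_running():
#     raise RuntimeError("Ollama server not reachable at http://localhost:11434")
```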
To run integration tests (`make integration_tests`), you will need the following models installed in your Ollama server:
- `llama3.1`
- `deepseek-r1:1.5b`
Install these models by running:
```bash
ollama pull llama3.1
ollama pull deepseek-r1:1.5b
```
## [Chat Models](https://python.langchain.com/api_reference/ollama/chat_models/langchain_ollama.chat_models.ChatOllama.html#chatollama)
The `ChatOllama` class exposes chat models from Ollama.
```python
from langchain_ollama import ChatOllama
llm = ChatOllama(model="llama3.1")
llm.invoke("Sing a ballad of LangChain.")
```
## [Embeddings](https://python.langchain.com/api_reference/ollama/embeddings/langchain_ollama.embeddings.OllamaEmbeddings.html#ollamaembeddings)
The `OllamaEmbeddings` class exposes embeddings from Ollama.
```python
from langchain_ollama import OllamaEmbeddings
embeddings = OllamaEmbeddings(model="llama3.1")
embeddings.embed_query("What is the meaning of life?")
```
## [LLMs](https://python.langchain.com/api_reference/ollama/llms/langchain_ollama.llms.OllamaLLM.html#ollamallm)
The `OllamaLLM` class exposes traditional text-completion LLMs from Ollama.
```python
from langchain_ollama import OllamaLLM
llm = OllamaLLM(model="llama3.1")
llm.invoke("The meaning of life is")
```
## Raw data

```json
{
  "_id": null,
  "home_page": null,
  "name": "langchain-ollama",
  "maintainer": null,
  "docs_url": null,
  "requires_python": ">=3.9",
  "maintainer_email": null,
  "keywords": null,
  "author": null,
  "author_email": null,
  "download_url": "https://files.pythonhosted.org/packages/82/67/93429a78d6fd40e2addf27e881db37e7f0076d712ffe9759ca0d5e10910e/langchain_ollama-0.3.6.tar.gz",
  "platform": null,
  "description": "# langchain-ollama\n\nThis package contains the LangChain integration with Ollama\n\n## Installation\n\n```bash\npip install -U langchain-ollama\n```\n\nFor the package to work, you will need to install and run the Ollama server locally ([download](https://ollama.com/download)).\n\nTo run integration tests (`make integration_tests`), you will need the following models installed in your Ollama server:\n\n- `llama3.1`\n- `deepseek-r1:1.5b`\n\nInstall these models by running:\n\n```bash\nollama pull <name-of-model>\n```\n\n## [Chat Models](https://python.langchain.com/api_reference/ollama/chat_models/langchain_ollama.chat_models.ChatOllama.html#chatollama)\n\n`ChatOllama` class exposes chat models from Ollama.\n\n```python\nfrom langchain_ollama import ChatOllama\n\nllm = ChatOllama(model=\"llama3.1\")\nllm.invoke(\"Sing a ballad of LangChain.\")\n```\n\n## [Embeddings](https://python.langchain.com/api_reference/ollama/embeddings/langchain_ollama.embeddings.OllamaEmbeddings.html#ollamaembeddings)\n\n`OllamaEmbeddings` class exposes embeddings from Ollama.\n\n```python\nfrom langchain_ollama import OllamaEmbeddings\n\nembeddings = OllamaEmbeddings(model=\"llama3.1\")\nembeddings.embed_query(\"What is the meaning of life?\")\n```\n\n## [LLMs](https://python.langchain.com/api_reference/ollama/llms/langchain_ollama.llms.OllamaLLM.html#ollamallm)\n\n`OllamaLLM` class exposes traditional LLMs from Ollama.\n\n```python\nfrom langchain_ollama import OllamaLLM\n\nllm = OllamaLLM(model=\"llama3.1\")\nllm.invoke(\"The meaning of life is\")\n```\n",
  "bugtrack_url": null,
  "license": "MIT",
  "summary": "An integration package connecting Ollama and LangChain",
  "version": "0.3.6",
  "project_urls": {
    "Release Notes": "https://github.com/langchain-ai/langchain/releases?q=tag%3A%22langchain-ollama%3D%3D0%22&expanded=true",
    "Source Code": "https://github.com/langchain-ai/langchain/tree/master/libs/partners/ollama",
    "repository": "https://github.com/langchain-ai/langchain"
  },
  "split_keywords": [],
  "urls": [
    {
      "comment_text": null,
      "digests": {
        "blake2b_256": "f3c51e559f5b43d62850ea2b44097afc944f38894eac00e7feef3b42f0428916",
        "md5": "e1de04764dcbbc972a4354a0095d41ea",
        "sha256": "b339bd3fcf913b8d606ad426ef39e7122695532507fcd85aa96271b3f33dc3df"
      },
      "downloads": -1,
      "filename": "langchain_ollama-0.3.6-py3-none-any.whl",
      "has_sig": false,
      "md5_digest": "e1de04764dcbbc972a4354a0095d41ea",
      "packagetype": "bdist_wheel",
      "python_version": "py3",
      "requires_python": ">=3.9",
      "size": 24535,
      "upload_time": "2025-07-22T17:26:58",
      "upload_time_iso_8601": "2025-07-22T17:26:58.556000Z",
      "url": "https://files.pythonhosted.org/packages/f3/c5/1e559f5b43d62850ea2b44097afc944f38894eac00e7feef3b42f0428916/langchain_ollama-0.3.6-py3-none-any.whl",
      "yanked": false,
      "yanked_reason": null
    },
    {
      "comment_text": null,
      "digests": {
        "blake2b_256": "826793429a78d6fd40e2addf27e881db37e7f0076d712ffe9759ca0d5e10910e",
        "md5": "be109040179f2a98c434c4d8ed9dc883",
        "sha256": "4270c4b30b3f3d10850cb9a1183b8c77d616195e0d9717ac745ef7f7f6cc2b6e"
      },
      "downloads": -1,
      "filename": "langchain_ollama-0.3.6.tar.gz",
      "has_sig": false,
      "md5_digest": "be109040179f2a98c434c4d8ed9dc883",
      "packagetype": "sdist",
      "python_version": "source",
      "requires_python": ">=3.9",
      "size": 30479,
      "upload_time": "2025-07-22T17:26:59",
      "upload_time_iso_8601": "2025-07-22T17:26:59.605061Z",
      "url": "https://files.pythonhosted.org/packages/82/67/93429a78d6fd40e2addf27e881db37e7f0076d712ffe9759ca0d5e10910e/langchain_ollama-0.3.6.tar.gz",
      "yanked": false,
      "yanked_reason": null
    }
  ],
  "upload_time": "2025-07-22 17:26:59",
  "github": true,
  "gitlab": false,
  "bitbucket": false,
  "codeberg": false,
  "github_user": "langchain-ai",
  "github_project": "langchain",
  "travis_ci": false,
  "coveralls": false,
  "github_actions": true,
  "lcname": "langchain-ollama"
}
```
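The `digests` recorded for each file can be used to verify a download before installing it. A minimal sketch using Python's standard `hashlib` (the local file path in the commented check is hypothetical):

```python
import hashlib

def sha256_of(path: str, chunk_size: int = 8192) -> str:
    """Compute the hex SHA-256 digest of a file, reading it in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            h.update(chunk)
    return h.hexdigest()

# Hypothetical check against the sdist digest recorded above:
# assert sha256_of("langchain_ollama-0.3.6.tar.gz") == (
#     "4270c4b30b3f3d10850cb9a1183b8c77d616195e0d9717ac745ef7f7f6cc2b6e"
# )
```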