# langchain-sarvam
Integration package connecting Sarvam AI chat completions with LangChain. Source: https://github.com/parth1609/langchain_sarvam
## Installation
Install with `uv`:
```bash
uv pip install langchain-sarvam
```
## Setup
```python
import os

# Read the Sarvam API key from the environment
sarvam_api_key = os.getenv("SARVAM_API_KEY")
```
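The key must be available before you construct `ChatSarvam`. A minimal stdlib sketch of ensuring it is set (the placeholder value below is for illustration only; `python-dotenv`, a declared dependency of this package, can load it from a `.env` file instead):

```python
import os

# For illustration only: normally you would export SARVAM_API_KEY in your
# shell, or keep it in a .env file loaded with python-dotenv.
os.environ.setdefault("SARVAM_API_KEY", "your-api-key-here")

sarvam_api_key = os.getenv("SARVAM_API_KEY")
print(sarvam_api_key is not None)  # True
```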
## Usage
### Basic Usage
```python
from langchain_sarvam import ChatSarvam
llm = ChatSarvam(model="sarvam-m", temperature=0.2, max_tokens=128)
resp = llm.invoke([("system", "You are helpful"), ("human", "Hello!")])
print(resp.content)
```
### Language-Specific Usage
```python
import os

from langchain_sarvam import ChatSarvam

llm = ChatSarvam(
    model="sarvam-m",
    temperature=0.7,
    sarvam_api_key=os.getenv("SARVAM_API_KEY"),
)
response = llm.invoke([
    ("system", "Respond in Hindi."),
    ("human", "What is the color of the sky?"),
])
print(response.content)  # Output: आसमान का रंग नीला होता है... ("The sky is blue...")
```
### Advanced Content Generation
```python
from langchain_sarvam import ChatSarvam
llm = ChatSarvam(model="sarvam-m")
# Generate a blog post outline
response = llm.invoke("Create an outline for a blog post on the topic of AI engineering.")
print(response.content)
```
### Batch Processing
```python
from langchain_sarvam import ChatSarvam
from langchain_core.messages import HumanMessage
chat = ChatSarvam(model="sarvam-m")
# Batch processing: pass a list of message lists
messages = [
    [HumanMessage(content="Tell me a joke")],
    [HumanMessage(content="What's the weather like?")],
]

responses = chat.batch(messages)
for response in responses:
    print(response.content)
```
### Using generate() Method
```python
from langchain_sarvam import ChatSarvam
from langchain_core.messages import HumanMessage
chat = ChatSarvam(model="sarvam-m")
# generate() expects a list of message lists
inputs = [
    [HumanMessage(content="Tell me a joke with emojis only")],
    [HumanMessage(content="What's the weather like?")],
]

result = chat.generate(inputs)
for generation_list in result.generations:
    # generation_list is a list of ChatGeneration objects
    for generation in generation_list:
        print(generation.message.content)
```
### Streaming
```python
for chunk in ChatSarvam(model="sarvam-m", streaming=True).stream("Tell me a joke"):
    print(chunk.text, end="")
```
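Each chunk carries only a fragment of the reply, so concatenating chunk text rebuilds the full message. A sketch of that accumulation pattern, using a stand-in generator in place of a live `ChatSarvam` call (which would require an API key):

```python
from dataclasses import dataclass

@dataclass
class Chunk:
    """Stand-in for a streamed chunk exposing a `.text` attribute."""
    text: str

def fake_stream():
    # Stand-in for ChatSarvam(...).stream(...): yields partial text pieces.
    for piece in ["Why did ", "the chicken ", "cross the road?"]:
        yield Chunk(text=piece)

# Accumulate the streamed fragments into the complete reply.
reply = "".join(chunk.text for chunk in fake_stream())
print(reply)  # Why did the chicken cross the road?
```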