# ThinAgents
A lightweight, pluggable AI Agent framework for Python.
Build LLM-powered agents that can use tools, remember conversations, and connect to external resources with minimal code. ThinAgents uses `litellm` under the hood for language model interactions.
[Docs](https://thinagents.vercel.app/)
---
## Installation
```bash
pip install thinagents
```
---
## Basic Usage
Create an agent and interact with an LLM in just a few lines:
```python
from thinagents import Agent

agent = Agent(
    name="Greeting Agent",
    model="openai/gpt-4o-mini",
)

response = await agent.arun("Hello, how are you?")
print(response.content)
```
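The snippets in this README use top-level `await`, which works in notebooks and async REPLs. In a plain script, wrap the call in `asyncio.run`. A minimal sketch of the pattern, using a stub coroutine in place of `agent.arun` so it runs without an API key:

```python
import asyncio

# Stand-in for agent.arun so the pattern runs without credentials;
# swap `fake_arun` for `agent.arun` in real code.
async def fake_arun(prompt: str) -> str:
    await asyncio.sleep(0)  # simulate network latency
    return f"echo: {prompt}"

async def main() -> str:
    return await fake_arun("Hello, how are you?")

result = asyncio.run(main())  # entry point for a plain script
print(result)
```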
---
## Tools
Agents can use Python functions as tools to perform actions or fetch data.
```python
from thinagents import Agent

def get_weather(city: str) -> str:
    return f"The weather in {city} is sunny."

agent = Agent(
    name="Weather Agent",
    model="openai/gpt-4o-mini",
    tools=[get_weather],
)

response = await agent.arun("What is the weather in Tokyo?")
print(response.content)
```
---
## Tools with Decorator
For richer metadata and parameter validation, use the `@tool` decorator:
```python
from thinagents import Agent, tool

@tool(name="get_weather")
def get_weather(city: str) -> str:
    """Get the weather for a city."""
    return f"The weather in {city} is sunny."

agent = Agent(
    name="Weather Pro",
    model="openai/gpt-4o-mini",
    tools=[get_weather],
)
```
You can also use Pydantic models for parameter schemas:
```python
from pydantic import BaseModel, Field
from thinagents import tool

class MultiplyInputSchema(BaseModel):
    a: int = Field(description="First operand")
    b: int = Field(description="Second operand")

@tool(name="multiply_tool", pydantic_schema=MultiplyInputSchema)
def multiply(a: int, b: int) -> int:
    """Multiply two integers."""
    return a * b
```
---
## Returning Content and Artifact
Sometimes, a tool should return both a summary (for the LLM) and a large artifact (for downstream use):
```python
from thinagents import tool

@tool(return_type="content_and_artifact")
def summarize_and_return_data(query: str) -> tuple[str, dict]:
    data = {"rows": list(range(10000))}
    summary = f"Found {len(data['rows'])} rows for query: {query}"
    return summary, data

# Assumes `agent` was created with tools=[summarize_and_return_data]
response = await agent.arun("Summarize the data for X")
print(response.content)   # summary sent to the LLM
print(response.artifact)  # full payload available downstream
```
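The tuple convention itself is plain Python: the first element is the short string handed to the model, the second is the full payload kept out of the prompt. A runnable sketch of the same function without the decorator:

```python
def summarize_and_return_data(query: str) -> tuple[str, dict]:
    """Return a short LLM-facing summary plus the full payload."""
    data = {"rows": list(range(10000))}
    summary = f"Found {len(data['rows'])} rows for query: {query}"
    return summary, data

content, artifact = summarize_and_return_data("X")
print(content)                 # small string for the model
print(len(artifact["rows"]))   # large payload for downstream use
```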
---
## Async Usage
ThinAgents is async by design. You can stream responses or await the full result:
```python
# Streaming
async for chunk in agent.astream("List files and get weather", conversation_id="1"):
    print(chunk.content, end="", flush=True)

# Or get the full response at once (non-streaming)
response = await agent.arun("List files and get weather", conversation_id="1")
print(response.content)
```
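The streaming loop is the standard async-iterator pattern. A runnable sketch using a stub generator in place of `agent.astream` (the chunk shape, an object with a `.content` attribute, matches the snippet above; the stub itself is illustrative):

```python
import asyncio
from dataclasses import dataclass

@dataclass
class Chunk:
    content: str  # stand-in for the chunk objects yielded while streaming

async def fake_astream(prompt: str):
    # Stub async generator standing in for agent.astream.
    for word in ["Hello", " ", "world"]:
        yield Chunk(content=word)

async def consume() -> str:
    parts = []
    async for chunk in fake_astream("List files"):
        parts.append(chunk.content)  # same loop shape as with the real agent
    return "".join(parts)

streamed = asyncio.run(consume())
print(streamed)
```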
---
## Memory
Agents can remember previous messages and tool results by attaching a memory backend.
```python
from thinagents import Agent
from thinagents.memory import InMemoryStore

agent = Agent(
    name="Memory Demo",
    model="openai/gpt-4o-mini",
    memory=InMemoryStore(),  # fast, in-memory storage
)

conv_id = "demo-1"
print(await agent.arun("Hi, I'm Alice!", conversation_id=conv_id))
print(await agent.arun("What is my name?", conversation_id=conv_id))
# → "Your name is Alice."
```
### Persistent Memory
```python
from thinagents import Agent
from thinagents.memory import FileMemory, SQLiteMemory

file_agent = Agent(
    name="File Mem Agent",
    model="openai/gpt-4o-mini",
    memory=FileMemory(storage_dir="./agent_mem"),
)

db_agent = Agent(
    name="SQLite Mem Agent",
    model="openai/gpt-4o-mini",
    memory=SQLiteMemory(db_path="./agent_mem.db"),
)
```
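Conceptually, a persistent backend like `SQLiteMemory` keys stored messages by conversation id. A minimal stdlib sketch of that idea (illustrative only, not ThinAgents' actual schema):

```python
import sqlite3

conn = sqlite3.connect(":memory:")  # use a file path for real persistence
conn.execute(
    "CREATE TABLE IF NOT EXISTS messages ("
    "conversation_id TEXT, role TEXT, content TEXT)"
)

def add_message(conv_id: str, role: str, content: str) -> None:
    conn.execute("INSERT INTO messages VALUES (?, ?, ?)", (conv_id, role, content))

def history(conv_id: str) -> list[tuple[str, str]]:
    # Messages are scoped per conversation, returned in insertion order.
    return conn.execute(
        "SELECT role, content FROM messages "
        "WHERE conversation_id = ? ORDER BY rowid",
        (conv_id,),
    ).fetchall()

add_message("demo-1", "user", "Hi, I'm Alice!")
add_message("demo-1", "assistant", "Hello Alice!")
print(history("demo-1"))
```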
#### Storing Tool Artifacts
Enable artifact storage in memory:
```python
agent = Agent(
    ...,
    memory=InMemoryStore(store_tool_artifacts=True),
)
```
---
## Model Context Protocol (MCP) Integration
Connect your agent to external resources (files, APIs, etc.) using MCP.
```python
agent = Agent(
    name="MCP Agent",
    model="openai/gpt-4o-mini",
    mcp_servers=[
        {
            "transport": "sse",
            "url": "http://localhost:8100/sse",
        },
        {
            "transport": "stdio",
            "command": "npx",
            "args": [
                "-y",
                "@modelcontextprotocol/server-filesystem",
                "/path/to/dir",
            ],
        },
    ],
)
```
---
## License
MIT