# OpenAgentKit
[PyPI](https://pypi.org/project/openagentkit/0.1.0a13/)
[License: Apache 2.0](https://opensource.org/licenses/Apache-2.0)
A comprehensive open-source toolkit for building agentic applications. OpenAgentKit provides a unified interface to work with various LLM providers, tools, and agent frameworks.
**WARNING**: Everything here is still in development; expect bugs and unsupported features. Contributions are very welcome!
## Features
- **Lightweight Structure**: Keeps the core features of AI agents while leaving room for custom extensions without clutter.
- **Unified LLM Interface**: A consistent API across multiple LLM providers, built on the OpenAI API (more providers to come!)
- **Generator-based event stream**: Event-driven processing using a generator
- **Async Support**: Built-in asynchronous processing for high-performance applications
- **Tool Integration**: Pre-built tools for common agent tasks
- **Extensible Architecture**: Easily add custom models and tools
- **Type Safety**: Comprehensive typing support with Pydantic models
## Installation
```bash
pip install openagentkit==0.1.0a13
```
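The pinned version above is an alpha build. To track the latest pre-release instead of a specific one, pip's `--pre` flag opts in to pre-release versions:

```bash
# Alternatively, opt in to pre-releases and let pip pick the newest alpha
pip install --pre openagentkit
```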
## Quick Start
```python
from openagentkit.modules.openai import OpenAIAgent
from openagentkit.core.tools.base_tool import tool
import openai
import os
import json
# Define a tool
@tool # Wrap the function in a tool decorator to automatically create a schema
def get_weather(city: str):
"""Get the weather of a city"""
# Actual implementation here...
# ...
return f"Weather in {city}: sunny, 20°C, feels like 22°C, humidity: 50%"
# Initialize OpenAI client
client = openai.OpenAI(
api_key=os.getenv("OPENAI_API_KEY"),
)
agent = OpenAIAgent(
client=client,
model="gpt-4o-mini",
system_message="""
You are a helpful assistant that can answer questions and help with tasks.
You are also able to use tools to get information.
""",
tools=[get_weather],
temperature=0.5,
max_tokens=100,
top_p=1.0,
)
generator = agent.execute(
messages=[
{"role": "user", "content": "What's the weather like in New York?"}
],
)
for response in generator:
print(response)
print(json.dumps(agent.get_history(), indent=2))
```
## Supported Integrations
- **LLM Providers**:
- OpenAI
- SmallestAI
- Azure OpenAI (via OpenAI integration)
- More coming soon!
- **Tools** *(Mostly for prototyping purposes)*:
  - Weather information *(requires `WEATHERAPI_API_KEY`; see the environment setup below)*
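The bundled weather tool reads its API key from the environment. A minimal shell setup (the variable name comes from the note above; the value is a placeholder):

```bash
# Placeholder value; replace with your actual key
export WEATHERAPI_API_KEY="your-weatherapi-key"
```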
## Architecture
OpenAgentKit is built with a modular architecture (see the import sketch after this list):
- **Interfaces**: Abstract base classes defining the contract for all implementations
- **Models**: Pydantic models for type-safe data handling
- **Modules**: Implementation of various services and integrations
- **Handlers**: Processors for tools and other extensions
- **Utils**: Helper functions and utilities
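For orientation, the import paths used in the examples throughout this README map onto that layout. A non-exhaustive sketch, listing only paths that actually appear below:

```python
# Core layer: tool abstractions, context storage, and helpers
from openagentkit.core.tools.base_tool import Tool, tool    # Tool base class and @tool decorator
from openagentkit.core.context import InMemoryContextStore  # context store for chat history

# Modules layer: provider-specific integrations
from openagentkit.modules.openai import OpenAIAgent, AsyncOpenAIAgent
```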
## Advanced Usage
### Asynchronous Processing
```python
from openagentkit.modules.openai import AsyncOpenAIAgent
from openagentkit.core.tools.base_tool import tool
import asyncio
import openai
import os
# Define an async tool
@tool # Wrap the function in a tool decorator to automatically create a schema
async def get_weather(city: str):
"""Get the weather of a city"""
# Actual implementation here...
# ...
return f"Weather in {city}: sunny, 20°C, feels like 22°C, humidity: 50%"
# Initialize OpenAI client
client = openai.AsyncOpenAI(
api_key=os.getenv("OPENAI_API_KEY"),
)
async def main():
# Initialize LLM service
agent = AsyncOpenAIAgent(
client=client,
model="gpt-4o-mini",
system_message="""
You are a helpful assistant that can answer questions and help with tasks.
You are also able to use tools to get information.
""",
tools=[get_weather],
temperature=0.5,
max_tokens=100,
top_p=1.0,
)
generator = agent.execute(
messages=[
{"role": "user", "content": "What's the weather like in New York?"}
]
)
async for response in generator:
print(response.content)
if __name__ == "__main__":
asyncio.run(main())
```
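Because both the agent and the tool are async, independent requests can be fanned out concurrently with plain `asyncio`. This is a sketch only: whether a single agent instance can safely serve concurrent calls is an assumption here, so creating one agent per task is the conservative choice.

```python
import asyncio

async def ask(agent, question: str) -> None:
    # Consume the agent's async event stream for a single question
    async for event in agent.execute(messages=[{"role": "user", "content": question}]):
        if event.content:
            print(event.content)

async def run_many(agent) -> None:
    # Fan out independent questions; asyncio.gather awaits them concurrently
    await asyncio.gather(
        ask(agent, "What's the weather like in New York?"),
        ask(agent, "What's the weather like in Hanoi?"),
    )
```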
### Custom Tool Integration
#### Using the `@tool` decorator:
```python
from openagentkit.core.utils.tool_wrapper import tool
# Define a tool
@tool # Wrap the function in a tool decorator to automatically create a schema
def get_weather(city: str):
"""Get the weather of a city""" # Always try to add pydoc in the function for better comprehension by LLM
# Actual implementation here...
# ...
return f"Weather in {city}: sunny, 20°C, feels like 22°C, humidity: 50%"
# Get the tool schema
print(get_weather.schema)
# Run the tool like any other function
weather_response = get_weather("Hanoi")
print(weather_response)
```
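Tools are not limited to a single string argument. A hypothetical multi-parameter tool looks the same, assuming the decorator derives the schema from standard type hints and the docstring as it does for `str` above:

```python
from openagentkit.core.utils.tool_wrapper import tool

@tool
def convert_temperature(value: float, to_unit: str):
    """Convert a temperature between Celsius and Fahrenheit. to_unit is "C" or "F"."""
    if to_unit.upper() == "F":
        return f"{value * 9 / 5 + 32:.1f}°F"
    return f"{(value - 32) * 5 / 9:.1f}°C"

# The generated schema should list both parameters
print(convert_temperature.schema)

# Call it like a normal function
print(convert_temperature(20.0, "F"))  # -> 68.0°F
```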
#### By subclassing Tool:
```python
from openagentkit.core.tools.base_tool import Tool
class GetWeather(Tool):
"""
A tool to get the current weather of a city.
"""
def __call__(self, city: str) -> str:
"""
Get the current weather in a city.
"""
# Simulate a weather API call
return f"The current weather in {city} is sunny with a temperature of 25°C."
get_weather = GetWeather()
# Get the tool schema
print(get_weather.schema)
# Run the tool like any other function
weather_response = get_weather("Hanoi")
print(weather_response)
```
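Because a `Tool` subclass is an ordinary object, it can also carry state such as credentials. The sketch below assumes the base `Tool.__init__` takes no required arguments of its own; if it does, adapt accordingly:

```python
from openagentkit.core.tools.base_tool import Tool

class GetWeatherFromAPI(Tool):
    """
    A tool that would fetch the current weather of a city from an HTTP API.
    """
    def __init__(self, api_key: str) -> None:
        super().__init__()  # assumption: the base class needs no constructor arguments
        self._api_key = api_key

    def __call__(self, city: str) -> str:
        """
        Get the current weather in a city.
        """
        # A real implementation would call the weather provider's HTTP API using self._api_key
        return f"(stub) Weather for {city}, fetched with key ending in ...{self._api_key[-4:]}"

get_weather = GetWeatherFromAPI(api_key="dummy-key-1234")
print(get_weather.schema)
print(get_weather("Hanoi"))
```

The resulting instance can then be passed to an agent exactly like the decorator-based tools, e.g. `tools=[get_weather]`.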
### Custom Context Store
An agent needs access to context (chat) history to truly act as an agent. OpenAgentKit provides a ContextStore module that supports cache providers such as Redis and Valkey, plus an in-memory store (InMemory) for quick testing.
```python
from openagentkit.modules.openai import AsyncOpenAIAgent
from openagentkit.core.context import InMemoryContextStore
import openai
import asyncio
from dotenv import load_dotenv
from pydantic import BaseModel
import os
load_dotenv()
context_store = InMemoryContextStore()
async def main():
client = openai.AsyncOpenAI(
api_key=os.getenv("OPENAI_API_KEY"),
)
    # When initializing an agent, you can pass a thread_id and/or agent_id as identifiers for the default context scope. Both values are immutable for consistency.
agent = AsyncOpenAIAgent(
client=client,
system_message="You are a helpful assistant.",
context_store=context_store,
thread_id="test"
agent_id="AssistantA"
)
# Access the thread_id property
print(f"Thread ID: {agent.thread_id}")
async for event in agent.execute(
messages=[
{
"role": "user",
"content": "Hi, my name is John."
}
]
):
if event.content:
print(f"Response: {event.content}")
    # If no thread_id is passed to execute(), it defaults to the thread_id set at initialization.
async for event in agent.execute(
messages=[
{
"role": "user",
"content": "What is my name?"
}
]
):
if event.content:
print(f"Response: {event.content}")
async for event in agent.execute(
messages=[
{
"role": "user",
"content": "What is my name?"
}
],
thread_id="new_context" # Since this is a new thread, the agent will no longer knowledge of the previous interaction
):
if event.content:
print(f"Response: {event.content}")
    # If no thread_id is passed to execute(), it defaults to the thread_id set at initialization.
async for event in agent.execute(
messages=[
{
"role": "user",
"content": "Okay lovely, can you refer me to my name at the end of your sentence always?"
}
],
):
if event.content:
print(f"Response: {event.content}")
    # Get all contexts associated with this agent instance (by agent_id)
print(context_store.get_agent_context(agent.agent_id))
if __name__ == "__main__":
asyncio.run(main())
# Get Context from thread_id
print(context_store.get_context("new_context"))
```
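After `main()` finishes, both threads live in the same store instance. This follow-up only reuses calls shown above and assumes threads are keyed by the `thread_id` strings passed in:

```python
# Inspect both threads held by the same store
print(context_store.get_context("test"))         # the default thread used above
print(context_store.get_context("new_context"))  # the thread created explicitly

# Everything associated with this agent_id
print(context_store.get_agent_context("AssistantA"))
```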
## Contributing
Contributions are welcome! Please feel free to submit a Pull Request.
1. Fork the repository
2. Create your feature branch (`git checkout -b feature/amazing-feature`)
3. Commit your changes (`git commit -m 'Add some amazing feature'`)
4. Push to the branch (`git push origin feature/amazing-feature`)
5. Open a Pull Request
## License
This project is licensed under the Apache License 2.0 - see the `LICENSE` file for details.