<div align="center"><picture>
<source media="(prefers-color-scheme: dark)" srcset="https://github.com/Anilturaga/aiide/blob/main/assets/figures/logo_dark.svg?raw=True">
<img alt="aiide" src="https://github.com/Anilturaga/aiide/blob/main/assets/figures/logo.svg?raw=True" width="200">
</picture></div>
<br/>
aiide is a framework for building LLM copilots.
It was born out of three years of building LLM applications, from the GPT-3 completion models through to the latest frontier chat models.
| What you get with aiide | What's not part of aiide |
|--------------------------|--------------------------|
| Full control over content sent to the LLM | Verbose abstractions for common prompting techniques |
| Tools and structured outputs are first class citizens for actions and content generation | Chains as a core building block |
| Simplified streaming by default to build real-time apps | Output parsing tools |
| Messages history is a Pandas DataFrame | Complex nested JSON objects |
## Table of Contents
* [Installation](#installation)
* [Chat](#chat)
* [Memory](#memory)
* [Structured Outputs](#structured-outputs)
* [Tools](#tools)
* [JSON Schema](#json-schema)
## Installation
Let's start by installing the package.
```bash
pip install aiide
```
This also installs LiteLLM and Pandas by default. If you would like to use other LLM providers such as Anthropic or Google AI, install the respective SDK as well.
The tutorial uses OpenAI models throughout, but it should work with all the popular LLM providers.
## Chat
Now that aiide is installed, let's create a simple chatbot similar to the ChatGPT free tier.
```python
from aiide import Aiide

class Chatbot(Aiide):
    def __init__(self):
        self.setup(system_message="You are a helpful assistant.", model="gpt-4o-mini-2024-07-18")

agent = Chatbot()
while True:
    user_input = input("\nSend a message: ")
    if user_input == "exit":
        break
    for delta in agent.chat(user_message=user_input):
        if delta["type"] == "text":
            print(delta["delta"], end="")
```
Let's break down the code:
- We define a class `Chatbot` that inherits from `Aiide`. Classes are a great way to encapsulate the chatbot's logic and state, and they make it easy to share state with tools and structured outputs, as we will see later.
- We set up the chatbot with a system message and a model. The model can be anything supported by LiteLLM.
- We then create an instance of `Chatbot` and start a loop to chat with the bot.
- `chat` returns a generator of deltas. Each delta has a `type`, which can be `text`, `tool_call`, or `tool_response`. A `text` delta has `delta` and `content` as its keys; a `tool_call` delta has `name` and `arguments`; a `tool_response` delta has `name`, `arguments`, and `response`.
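For concreteness, here is what deltas of each type might look like. The values below are illustrative, not real output; in practice these dictionaries come from `agent.chat()`:

```python
# Illustrative delta payloads, one per type, matching the key shapes
# described above (the values are made up):
text_delta = {"type": "text", "delta": "Hel", "content": "Hel"}
tool_call_delta = {"type": "tool_call", "name": "get_weather", "arguments": '{"location": "SF"}'}
tool_response_delta = {
    "type": "tool_response",
    "name": "get_weather",
    "arguments": '{"location": "SF"}',
    "response": '{"temperature": 72}',
}
```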
#### User Message Input
`user_message` accepts a few types of input: a string (as you've just seen), an image object (`PIL.Image`), or a list mixing strings and images.
Different ways to use `user_message`:
```python
from PIL import Image

# A plain string
user_message = "What's the weather like in SF"

# A single image
image = Image.open("image.jpg")
user_message = image

# A list mixing images and text
user_message = [image, "Annotate the attached image"]
```
<!--
user_message = {"RAG":"Some large content","query":"Actual user message"}
Reasoning for using dict as input:
When you pass a dict for user_message in aiide, it will only pass in the values to the LLM. The keys are instead useful to later update or remove pieces of information from the memory/chat history.
-->
## Memory
A natural question about the snippet above: how do we track the chat history?
aiide has first-class support for memory. I found that handling OpenAI's JSON-based message format is cumbersome and error-prone, so I abstracted the chat history into a Pandas DataFrame.
`messages` is a pandas DataFrame that stores all the messages, tool calls, and responses of a chat session.
You can access the memory of the chatbot as `agent.messages` in the example above.
The schema of the messages DataFrame is as follows:
| role | content | arguments | response |
|-------------|-----------|-------------| ------------ |
| system | You are a helpful assistant | None | None |
| user | What's the weather like in SF | None | None |
| tool | {'name':'get_weather','id':'abc'} | {'location':'SF'} | {'temperature':'72'} |
| assistant | The weather is 72 degrees right now. | None | None |
You can use the memory DataFrame to analyze and manipulate the chat history and the tool calls and responses.
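Since `messages` is an ordinary DataFrame, the usual pandas operations apply. The sketch below builds a mock DataFrame with the schema above (not produced by aiide itself) and filters out the tool invocations:

```python
import pandas as pd

# A mock messages DataFrame following the schema shown above
# (in practice you would read agent.messages instead):
messages = pd.DataFrame([
    {"role": "system", "content": "You are a helpful assistant", "arguments": None, "response": None},
    {"role": "user", "content": "What's the weather like in SF", "arguments": None, "response": None},
    {"role": "tool", "content": {"name": "get_weather", "id": "abc"},
     "arguments": {"location": "SF"}, "response": {"temperature": "72"}},
    {"role": "assistant", "content": "The weather is 72 degrees right now.", "arguments": None, "response": None},
])

# Pull out every tool invocation with a plain boolean filter:
tool_calls = messages[messages["role"] == "tool"]
```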
## Structured Outputs
Currently the LLM can respond with text in any format: sometimes it thinks first, sometimes it answers in code right away. What if we want to structure the output in a specific way?
To my knowledge, OpenAI and Google AI currently support structured outputs. aiide provides a common interface for them.
Let's see how we can use structured outputs in aiide.
```python
from aiide import Aiide
from aiide.schema import structured_outputs_gen, Str

class Chatbot(Aiide):
    def __init__(self):
        self.setup(system_message="You are a helpful assistant.", model="gpt-4o-mini-2024-07-18")

    def structured_outputs(self):
        return structured_outputs_gen(
            name="chain_of_thought",
            properties=[
                Str(name="thinking", description="Use this field to think out loud. Break down the user's query, plan your response, etc."),
                Str(name="response"),
            ],
            required=["thinking", "response"],
        )

agent = Chatbot()
while True:
    user_input = input("\nSend a message: ")
    if user_input == "exit":
        break
    for delta in agent.chat(user_message=user_input, json_mode=True):
        if delta["type"] == "text":
            print(delta["delta"], end="")
```
Test out the infamous question `How many R letters are there in the word "strawberry"?`
Now, let's break down the changes:
- We import `structured_outputs_gen` and `Str` from `aiide.schema`. Both aid (heh) in defining the structured-output JSON schema.
> I did not love Pydantic for defining JSON schemas. I have observed that a lot of developers omit fields such as descriptions and enums when defining schemas with Pydantic, and it does not offer an easy way to change a schema dynamically. So I created a simple interface for defining structured outputs that is as flexible as possible and helps the developer see what each type supports through IntelliSense. Please check out [aiide's JSON Schema](./assets/schema_definitions.md) for more information.
- We override the `structured_outputs` method of `Aiide` to return the structured-output definition. The beauty of this is that you can define multiple structured outputs and return the appropriate one based on the context.
- We pass `json_mode=True` to the `chat` method, which enables structured outputs.
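In `json_mode` the model's reply arrives as text deltas that together form one JSON object, so a common pattern is to accumulate the stream and parse it at the end. The deltas below are mocked for illustration; in practice they come from `agent.chat(..., json_mode=True)`:

```python
import json

# Mocked stream of text deltas (stand-ins for agent.chat output):
deltas = [
    {"type": "text", "delta": '{"thinking": "Count the R', "content": ""},
    {"type": "text", "delta": 's in strawberry.", "response": "3"}', "content": ""},
]

# Accumulate the text chunks, then parse the completed JSON object:
buffer = "".join(d["delta"] for d in deltas if d["type"] == "text")
result = json.loads(buffer)
```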
## Tools
Tools are the heart of aiide. They are the actions the LLM can take, anything from a simple function call to a hierarchy of LLM agents.
You define tools as classes and pass instances in through the `chat` call's `tools` parameter. The lifecycle of handling a tool call, executing it, and feeding the response back to the LLM is all handled by aiide. You can, however, still control the execution of the tool based on the values of the deltas.
Let's see how we can define a tool in aiide.
```python
import random
import json
from aiide import Aiide, Tool
from aiide.schema import tool_def_gen, Str

class WeatherTool(Tool):
    def __init__(self, parent):
        self.parent = parent  # keep a reference to the agent for bi-directional communication
        self.error = False

    def tool_def(self):
        return tool_def_gen(
            name="get_current_weather",
            description="Get the current weather in a given location",
            properties=[
                Str(
                    name="location",
                    description="The city and state, e.g. San Francisco, CA",
                ),
                Str(name="unit", enums=["celsius", "fahrenheit"]),
            ],
        )

    def main(self, location, unit="default"):  # type: ignore
        if self.error:
            return json.dumps({"error": 404})
        return json.dumps({"location": location, "temperature": random.randint(0, 100), "unit": unit})

class Agent(Aiide):
    def __init__(self):
        # passing the agent instance to the tool for bi-directional communication
        self.weatherTool = WeatherTool(self)
        self.setup(
            system_message="You are a helpful assistant.",
        )

agent = Agent()
for delta in agent.chat(
    user_message="What's the weather like in San Francisco, Tokyo, and Paris?",
    tools=[agent.weatherTool],
):
    # print the response based on the type of delta
    if delta["type"] == "text":
        print(delta["delta"], end="")
    if delta["type"] == "tool_call":
        print("Tool called:", delta["name"], "with arguments:", delta["arguments"])
    if delta["type"] == "tool_response":
        print("Tool response for tool:", delta["name"], "with arguments:", delta["arguments"], "is:", delta["response"])

    # change the tool's behavior based on the context of the conversation
    if delta["type"] == "tool_call" and "tokyo" in delta["arguments"].lower():
        agent.weatherTool.error = True
    else:
        agent.weatherTool.error = False
```
Let's break down the code:
- We import `Tool` from `aiide`, and `tool_def_gen` and `Str` from `aiide.schema`. `Tool` is the base class for all tools in aiide, `tool_def_gen` generates the tool's JSON schema, and `Str` is a string type that can carry a description and enums.
- We define a class `WeatherTool` that inherits from `Tool`: `__init__` initializes the tool, `tool_def` defines the tool schema, and `main` implements the tool logic.
- To activate the tool, we pass the tool instance to the `chat` method through the `tools` list parameter.
- As mentioned previously, a delta has one of three types: `text`, `tool_call`, and `tool_response`. When the type is `tool_call`, the delta has `name` and `arguments` keys describing the call. After the tool executes, a delta of type `tool_response` follows, whose `response` key holds the tool's return value.
- Notice that we set a boolean flag `error` on the tool instance based on the location. This way, you can control a tool's execution based on the context of the conversation and the tool's state. A good example would be asking for the user's consent before executing code.
- The same mechanism lets you activate or deactivate tools based on the context of the conversation.
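The consent-before-execution idea mentioned above can be sketched with the same flag pattern. `CodeTool` below is hypothetical (it is not part of aiide and skips the `Tool` base class and schema for brevity); the point is only that `main` consults state the host sets between deltas:

```python
import json

# Hypothetical consent-gated tool: main() refuses to act until the host
# flips the `approved` flag (e.g. after prompting the user).
class CodeTool:
    def __init__(self):
        self.approved = False

    def main(self, code):
        if not self.approved:
            return json.dumps({"error": "user declined execution"})
        return json.dumps({"result": "executed"})

tool = CodeTool()
denied = tool.main("print('hi')")   # blocked: approval not yet given
tool.approved = True                 # host records the user's consent
allowed = tool.main("print('hi')")  # now the tool runs
```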
#### Delta Schema
The delta schema is as follows:
| Type | Keys |
|---------------|------|
| text | `delta`: the text content of this chunk<br>`content`: the full content accumulated so far, including the text and any additional data |
| tool_call | `name`: the name of the tool being called<br>`arguments`: the arguments passed to the tool |
| tool_response | `name`: the name of the tool that generated the response<br>`arguments`: the arguments passed to the tool<br>`response`: the response returned by the tool |
## JSON Schema
As mentioned earlier, aiide has a simple interface for defining JSON schemas. This is useful for structured outputs and for tools whose schemas change based on the context of the conversation.
Please read the tutorial on [aiide's JSON Schema](./assets/schema_definitions.md).