| Field | Value |
|:--|:--|
| Name | aios-agent-sdk |
| Version | 0.0.2 |
| home_page | None |
| Summary | Cerebrum: Agent SDK for AIOS |
| upload_time | 2024-11-26 19:00:06 |
| maintainer | None |
| docs_url | None |
| author | None |
| requires_python | <3.12,>=3.10 |
| license | MIT License Copyright (c) 2024 AGI Research Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. |
| keywords | agi, aios, cerebrum, llm, os |
| VCS | |
| bugtrack_url | |
| requirements | requests, platformdirs, pydantic |
| Travis-CI | No Travis. |
| coveralls test coverage | No coveralls. |
# Cerebrum: Agent SDK for AIOS
<a href='https://aios-3.gitbook.io/'><img src='https://img.shields.io/badge/Documentation-Cerebrum-blue'></a>
[![Code License](https://img.shields.io/badge/Code%20License-MIT-orange.svg)](https://github.com/agiresearch/AIOS/blob/main/LICENSE)
<a href='https://discord.gg/B2HFxEgTJX'><img src='https://img.shields.io/badge/Community-Discord-8A2BE2'></a>
The goal of AIOS is to build a Large Language Model (LLM) agent operating system, which embeds large language models into the operating system as the brain of the OS. AIOS is designed to address problems (e.g., scheduling, context switching, and memory management) that arise during the development and deployment of LLM-based agents, fostering a better ecosystem for agent developers and users.
## 🏠 Cerebrum Architecture
<p align="center">
<img src="docs/assets/aios-figs/architecture.jpg">
</p>
AIOS provides the AIOS kernel as an abstraction on top of the OS kernel. The kernel facilitates the installation, execution, and usage of agents, while the AIOS SDK facilitates their development and deployment.
## 📰 News
- **[2024-11-26]** 🔥 Cerebrum is available for public release on PyPI!
## Installation
### Standard Installation
1. **Install the package**
```bash
pip install aios-agent-sdk
```
2. **Verify installation**
```bash
python -c "import cerebrum; from cerebrum.client import Cerebrum; print(Cerebrum)"
```
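Note that this release declares `requires_python = "<3.12,>=3.10"`, so it can help to confirm the running interpreter before installing. The check below is a minimal sketch that uses only the Python standard library and the version constraint stated on this page:
```python
# Cerebrum 0.0.2 declares requires_python = "<3.12,>=3.10"; check the running interpreter.
import sys

supported = (3, 10) <= sys.version_info[:2] < (3, 12)
print(sys.version.split()[0], "-> supported" if supported else "-> outside the supported range")
```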
### Install From Source
1. **Clone Repo**
```bash
git clone https://github.com/agiresearch/Cerebrum.git
cd Cerebrum
```
2. **Create Virtual Environment**
```bash
conda create -n cerebrum-env python=3.10
```
or
```bash
conda create -n cerebrum-env python=3.11
```
or
```bash
# Windows (cmd)
python -m venv cerebrum-env
# Linux/MacOS
python3 -m venv cerebrum-env
```
3. **Activate the environment**
```bash
conda activate cerebrum-env
```
or
```bash
# Windows (cmd)
cerebrum-env\Scripts\activate.bat
# Linux/MacOS
source cerebrum-env/bin/activate
```
4. **Install the package**
```bash
pip install -e .
```
5. **Verify installation**
```bash
python -c "import cerebrum; from cerebrum.client import Cerebrum; print(Cerebrum)"
```
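After an editable install, it can also be useful to confirm which distribution version pip registered. This is a small sketch that relies only on the standard library (`importlib.metadata`) and the package name from this page; it is not part of the Cerebrum SDK itself:
```python
# Report the installed distribution version using only the standard library.
from importlib.metadata import PackageNotFoundError, version

try:
    print("aios-agent-sdk", version("aios-agent-sdk"))  # e.g. 0.0.2
except PackageNotFoundError:
    print("aios-agent-sdk is not installed in this environment")
```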
## ✈️ Quickstart
> [!TIP]
>
> Please see our ongoing [documentation](https://aios-3.gitbook.io/) for more information.
1. **Start the AIOS Kernel**
📝 See [here](https://aios-3.gitbook.io/aios-docs/getting-started/installation).
2. **Run the AIOS Basic Demo**
```bash
aios-basic-demo
```
The demo's code is located at `example/aios_demo.py`.
3. **Run the AIOS Concurrent Agent Demo**
```bash
aios-concurrent-demo
```
The demo's code is located at `example/aios_demo_concurrent.py`.
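If either demo command is not found, the standard-library `shutil.which` can confirm whether the entry points named above are on your `PATH`. This check is a sketch that assumes nothing beyond the two command names already shown:
```python
# Confirm the demo console scripts are reachable (equivalent to `which` on Linux/macOS).
import shutil

for cmd in ("aios-basic-demo", "aios-concurrent-demo"):
    print(cmd, "->", shutil.which(cmd) or "not found on PATH")
```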
### Supported LLM Cores
| Provider 🏢 | Model Name 🤖 | Open Source 🔓 | Model String ⌨️ | Backend ⚙️ |
|:------------|:-------------|:---------------|:---------------|:---------------|
| Anthropic | Claude 3.5 Sonnet | ❌ | claude-3-5-sonnet-20241022 |anthropic |
| Anthropic | Claude 3.5 Haiku | ❌ | claude-3-5-haiku-20241022 |anthropic |
| Anthropic | Claude 3 Opus | ❌ | claude-3-opus-20240229 |anthropic |
| Anthropic | Claude 3 Sonnet | ❌ | claude-3-sonnet-20240229 |anthropic |
| Anthropic | Claude 3 Haiku | ❌ | claude-3-haiku-20240307 |anthropic |
| OpenAI | GPT-4 | ❌ | gpt-4 |openai|
| OpenAI | GPT-4 Turbo | ❌ | gpt-4-turbo |openai|
| OpenAI | GPT-4o | ❌ | gpt-4o |openai|
| OpenAI | GPT-4o mini | ❌ | gpt-4o-mini |openai|
| OpenAI | GPT-3.5 Turbo | ❌ | gpt-3.5-turbo |openai|
| Google | Gemini 1.5 Flash | ❌ | gemini-1.5-flash |google|
| Google | Gemini 1.5 Flash-8B | ❌ | gemini-1.5-flash-8b |google|
| Google | Gemini 1.5 Pro | ❌ | gemini-1.5-pro |google|
| Google | Gemini 1.0 Pro | ❌ | gemini-1.0-pro |google|
| Groq | Llama 3.2 90B Vision | ✅ | llama-3.2-90b-vision-preview |groq|
| Groq | Llama 3.2 11B Vision | ✅ | llama-3.2-11b-vision-preview |groq|
| Groq | Llama 3.1 70B | ✅ | llama-3.1-70b-versatile |groq|
| Groq | Llama Guard 3 8B | ✅ | llama-guard-3-8b |groq|
| Groq | Llama 3 70B | ✅ | llama3-70b-8192 |groq|
| Groq | Llama 3 8B | ✅ | llama3-8b-8192 |groq|
| Groq | Mixtral 8x7B | ✅ | mixtral-8x7b-32768 |groq|
| Groq | Gemma 7B | ✅ | gemma-7b-it |groq|
| Groq | Gemma 2 9B | ✅ | gemma2-9b-it |groq|
| Groq | Llama3 Groq 70B | ✅ | llama3-groq-70b-8192-tool-use-preview |groq|
| Groq | Llama3 Groq 8B | ✅ | llama3-groq-8b-8192-tool-use-preview |groq|
| Ollama[^bignote1] | All Models | ✅ | model-name |ollama|
| VLLM[^bignote2] | All Models | ✅ | model-name |vllm|
| HuggingFace[^bignote3] | All Models | ✅ | model-name |huggingface|
[^bignote1]: **Ollama Model String Format:**
- Format: `model-name`. Some examples are `llama2`, `mistral`, or `codellama`.
- Ollama lets you run models locally via its CLI tool and API.
- See AIOS documentation for setting up Ollama
[^bignote2]: **VLLM Model String Format:**
- Format: `model-name`. Some examples are `mistral-7b-instruct`, `llama2-70b`, or `mixtral-8x7b`.
- VLLM is a high-performance inference engine that can serve any HuggingFace model
- You can specify custom model paths: `vllm/path/to/your/model`
- Supports quantization parameters: `vllm/mistral-7b-instruct-4bit`
- See AIOS documentation for setting up VLLM
[^bignote3]: **HuggingFace Model String Format:**
- Format: `owner/model-name`. Some examples are `mistralai/Mistral-7B-v0.1`, `meta-llama/Llama-2-70b-hf`, or `TheBloke/Llama-2-13B-GPTQ`.
- Can include specific versions: `owner/model-name@v1.0`
- Supports multiple model formats (PyTorch, Safetensors, GGUF, GPTQ)
- Can reference private models with auth token: `huggingface/username/private-model`
- See AIOS documentation for setting up HuggingFace (if necessary)
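To make the model-string conventions above concrete, the snippet below collects one example string per backend from the table. The mapping is purely illustrative; it is not part of the Cerebrum or AIOS API.

```python
# One example model string per backend, taken from the table above.
# This mapping is illustrative only; it is not part of the Cerebrum/AIOS API.
EXAMPLE_MODEL_STRINGS = {
    "anthropic": "claude-3-5-sonnet-20241022",
    "openai": "gpt-4o-mini",
    "google": "gemini-1.5-flash",
    "groq": "llama3-8b-8192",
    "ollama": "llama2",                          # any locally pulled Ollama model
    "vllm": "mistral-7b-instruct",               # any model served by vLLM
    "huggingface": "mistralai/Mistral-7B-v0.1",  # owner/model-name format
}

for backend, model in EXAMPLE_MODEL_STRINGS.items():
    print(f"{backend:<12} -> {model}")
```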
## 🖋️ References
```bibtex
@article{mei2024aios,
title={AIOS: LLM Agent Operating System},
author={Mei, Kai and Li, Zelong and Xu, Shuyuan and Ye, Ruosong and Ge, Yingqiang and Zhang, Yongfeng},
journal={arXiv:2403.16971},
year={2024}
}
@article{ge2023llm,
title={LLM as OS, Agents as Apps: Envisioning AIOS, Agents and the AIOS-Agent Ecosystem},
author={Ge, Yingqiang and Ren, Yujie and Hua, Wenyue and Xu, Shuyuan and Tan, Juntao and Zhang, Yongfeng},
journal={arXiv:2312.03815},
year={2023}
}
```
## 🚀 Contributions
For guidelines on how to contribute, see [CONTRIBUTE](https://github.com/agiresearch/Cerebrum/blob/main/CONTRIBUTE.md). If you would like to contribute to the codebase, [issues](https://github.com/agiresearch/Cerebrum/issues) and [pull requests](https://github.com/agiresearch/Cerebrum/pulls) are always welcome!
## 🌍 Cerebrum Contributors
[![Cerebrum contributors](https://contrib.rocks/image?repo=agiresearch/Cerebrum&max=300)](https://github.com/agiresearch/Cerebrum/graphs/contributors)
## 🤝 Discord Channel
If you would like to join the community, ask questions, chat with fellows, learn about or propose new features, and participate in future developments, join our [Discord Community](https://discord.gg/B2HFxEgTJX)!
## 📪 Contact
For issues related to Cerebrum development, we encourage submitting [issues](https://github.com/agiresearch/Cerebrum/issues) or [pull requests](https://github.com/agiresearch/Cerebrum/pulls), or starting a discussion in the AIOS [Discord Channel](https://discord.gg/B2HFxEgTJX). For other matters, please feel free to contact the AIOS Foundation ([contact@aios.foundation](mailto:contact@aios.foundation)).
Raw data
{
"_id": null,
"home_page": null,
"name": "aios-agent-sdk",
"maintainer": null,
"docs_url": null,
"requires_python": "<3.12,>=3.10",
"maintainer_email": null,
"keywords": "agi, aios, cerebrum, llm, os",
"author": null,
"author_email": null,
"download_url": "https://files.pythonhosted.org/packages/42/b9/519ba5a29b1487ab07916e5661277d971d70e28e0d95ee8a83284ad505cc/aios_agent_sdk-0.0.2.tar.gz",
"platform": null,
"description": "# Cerebrum: Agent SDK for AIOS\n\n<a href='https://aios-3.gitbook.io/'><img src='https://img.shields.io/badge/Documentation-Cerebrum-blue'></a>\n[![Code License](https://img.shields.io/badge/Code%20License-MIT-orange.svg)](https://github.com/agiresearch/AIOS/blob/main/LICENSE)\n<a href='https://discord.gg/B2HFxEgTJX'><img src='https://img.shields.io/badge/Community-Discord-8A2BE2'></a>\n\nThe goal of AIOS is to build a Large Language Model (LLM) agent operating system, which intends to embed large language model into the operating system as the brain of the OS. AIOS is designed to address problems (e.g., scheduling, context switch, memory management, etc.) during the development and deployment of LLM-based agents, for a better ecosystem among agent developers and users.\n\n## \ud83c\udfe0 Cerebrum Architecture\n<p align=\"center\">\n<img src=\"docs/assets/aios-figs/architecture.jpg\">\n</p>\n\n\nAIOS provides the AIOS kernel as an abstraction on top of the OS kernel. The kernel facilitates the installation, execution and usage of agents. Furthermore, the AIOS SDK facilitates the development and deployment of agents.\n\n## \ud83d\udcf0 News\n- **[2024-11-026]** \ud83d\udd25 Cerebrum is available for public release on PyPi!\n\n## Installation\n\n### Standard Installation\n\n1. **Install the package**\n ```bash\n pip install aios-agent-sdk\n ```\n\n4. **Verify installation**\n ```bash\n python -c \"import cerebrum; from cerebrum.client import Cerebrum; print(Cerebrum)\"\n ```\n\n### Install From Source\n1. **Clone Repo**\n ```bash\n git clone https://github.com/agiresearch/Cerebrum.git\n\n cd Cerebrum\n ```\n\n3. **Create Virtual Environment**\n ```bash\n conda create -n cerebrum-env python=3.10\n ```\n or\n ```bash\n conda create -n cerebrum-env python=3.11\n ```\n or\n ```bash\n # Windows (cmd)\n python -m venv cerebrum-env\n\n # Linux/MacOS\n python3 -m venv cerebrum-env\n ```\n\n4. **Activate the environment**\n ```bash\n conda activate myenv\n ```\n or\n ```bash\n # Windows (cmd)\n cd cerebrum-env\n cd Scripts\n activate.bat\n cd ..\n cd ..\n \n\n # Linux/MacOS\n source cerebrum-env/bin/activate\n ```\n\n6. **Install the package**\n ```bash\n pip install -e .\n ```\n\n7. **Verify installation**\n ```bash\n python -c \"import cerebrum; from cerebrum.client import Cerebrum; print(Cerebrum)\"\n ```\n\n## \u2708\ufe0f Quickstart\n> [!TIP] \n>\n> Please see our ongoing [documentation](https://aios-3.gitbook.io/) for more information.\n\n1. **Start the AIOS Kernel** \n \ud83d\udcdd See [here](https://aios-3.gitbook.io/aios-docs/getting-started/installation).\n\n2. **Run the AIOS Basic Demo**\n ```bash\n aios-basic-demo\n ```\n\n Code file is located at `example/aios_demo.py`\n\n3. 
**Run the AIOS Concurrent Agent Demo**\n ```bash\n aios-concurrent-demo\n ```\n\n Code file is located at `example/aios_demo_concurrent.py`\n\n### Supported LLM Cores\n\n| Provider \ud83c\udfe2 | Model Name \ud83e\udd16 | Open Source \ud83d\udd13 | Model String \u2328\ufe0f | Backend \u2699\ufe0f |\n|:------------|:-------------|:---------------|:---------------|:---------------|\n| Anthropic | Claude 3.5 Sonnet | \u274c | claude-3-5-sonnet-20241022 |anthropic |\n| Anthropic | Claude 3.5 Haiku | \u274c | claude-3-5-haiku-20241022 |anthropic |\n| Anthropic | Claude 3 Opus | \u274c | claude-3-opus-20240229 |anthropic |\n| Anthropic | Claude 3 Sonnet | \u274c | claude-3-sonnet-20240229 |anthropic |\n| Anthropic | Claude 3 Haiku | \u274c | claude-3-haiku-20240307 |anthropic |\n| OpenAI | GPT-4 | \u274c | gpt-4 |openai|\n| OpenAI | GPT-4 Turbo | \u274c | gpt-4-turbo |openai|\n| OpenAI | GPT-4o | \u274c | gpt-4o |openai|\n| OpenAI | GPT-4o mini | \u274c | gpt-4o-mini |openai|\n| OpenAI | GPT-3.5 Turbo | \u274c | gpt-3.5-turbo |openai|\n| Google | Gemini 1.5 Flash | \u274c | gemini-1.5-flash |google|\n| Google | Gemini 1.5 Flash-8B | \u274c | gemini-1.5-flash-8b |google|\n| Google | Gemini 1.5 Pro | \u274c | gemini-1.5-pro |google|\n| Google | Gemini 1.0 Pro | \u274c | gemini-1.0-pro |google|\n| Groq | Llama 3.2 90B Vision | \u2705 | llama-3.2-90b-vision-preview |groq|\n| Groq | Llama 3.2 11B Vision | \u2705 | llama-3.2-11b-vision-preview |groq|\n| Groq | Llama 3.1 70B | \u2705 | llama-3.1-70b-versatile |groq|\n| Groq | Llama Guard 3 8B | \u2705 | llama-guard-3-8b |groq|\n| Groq | Llama 3 70B | \u2705 | llama3-70b-8192 |groq|\n| Groq | Llama 3 8B | \u2705 | llama3-8b-8192 |groq|\n| Groq | Mixtral 8x7B | \u2705 | mixtral-8x7b-32768 |groq|\n| Groq | Gemma 7B | \u2705 | gemma-7b-it |groq|\n| Groq | Gemma 2B | \u2705 | gemma2-9b-it |groq|\n| Groq | Llama3 Groq 70B | \u2705 | llama3-groq-70b-8192-tool-use-preview |groq|\n| Groq | Llama3 Groq 8B | \u2705 | llama3-groq-8b-8192-tool-use-preview |groq|\n| Ollama[^bignote1] | All Models | \u2705 | model-name |ollama|\n| VLLM[^bignote2] | All Models | \u2705 | model-name |vllm|\n| HuggingFace[^bignote3] | All Models | \u2705 | model-name |huggingface|\n\n[^bignote1]: **Ollama Model String Format:**\n - Format: `model-name`. Some examples are `llama2`, `mistral`, or `codellama`.\n - Ollama allows you to run models locally using their CLI tool and API\n - See AIOS documentation for setting up Ollama\n\n[^bignote2]: **VLLM Model String Format:**\n - Format: `model-name`. Some examples are `mistral-7b-instruct`, `llama2-70b`, or `mixtral-8x7b`.\n - VLLM is a high-performance inference engine that can serve any HuggingFace model\n - You can specify custom model paths: `vllm/path/to/your/model`\n - Supports quantization parameters: `vllm/mistral-7b-instruct-4bit`\n - See AIOS documentation for setting up VLLM\n\n[^bignote3]: **HuggingFace Model String Format:**\n - Format: `owner/model-name`. 
Some examples are `mistralai/Mistral-7B-v0.1`, `meta-llama/Llama-2-70b-hf`, or `TheBloke/Llama-2-13B-GPTQ`.\n - Can include specific versions: `owner/model-name@v1.0`\n - Supports multiple model formats (PyTorch, Safetensors, GGUF, GPTQ)\n - Can reference private models with auth token: `huggingface/username/private-model`\n - See AIOS documentation for setting up HuggingFace (if necessary)\n\n\n## \ud83d\udd8b\ufe0f References\n```\n@article{mei2024aios,\n title={AIOS: LLM Agent Operating System},\n author={Mei, Kai and Li, Zelong and Xu, Shuyuan and Ye, Ruosong and Ge, Yingqiang and Zhang, Yongfeng}\n journal={arXiv:2403.16971},\n year={2024}\n}\n@article{ge2023llm,\n title={LLM as OS, Agents as Apps: Envisioning AIOS, Agents and the AIOS-Agent Ecosystem},\n author={Ge, Yingqiang and Ren, Yujie and Hua, Wenyue and Xu, Shuyuan and Tan, Juntao and Zhang, Yongfeng},\n journal={arXiv:2312.03815},\n year={2023}\n}\n```\n\n## \ud83d\ude80 Contributions\nFor how to contribute, see [CONTRIBUTE](https://github.com/agiresearch/Cerebrum/blob/main/CONTRIBUTE.md). If you would like to contribute to the codebase, [issues](https://github.com/agiresearch/Cerebrum/issues) or [pull requests](https://github.com/agiresearch/Cerebrum/pulls) are always welcome!\n\n## \ud83c\udf0d Cerebrum Contributors\n[![Cerebrum contributors](https://contrib.rocks/image?repo=agiresearch/Cerebrum&max=300)](https://github.com/agiresearch/Cerebrum/graphs/contributors)\n\n\n## \ud83e\udd1d Discord Channel\nIf you would like to join the community, ask questions, chat with fellows, learn about or propose new features, and participate in future developments, join our [Discord Community](https://discord.gg/B2HFxEgTJX)!\n\n## \ud83d\udcea Contact\n\nFor issues related to Cerebrum development, we encourage submitting [issues](https://github.com/agiresearch/Cerebrum/issues), [pull requests](https://github.com/agiresearch/Cerebrum/pulls), or initiating discussions in AIOS [Discord Channel](https://discord.gg/B2HFxEgTJX). For other issues please feel free to contact the AIOS Foundation ([contact@aios.foundation](mailto:contact@aios.foundation)).\n\n\n\n\n",
"bugtrack_url": null,
"license": "MIT License Copyright (c) 2024 AGI Research Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the \"Software\"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. THE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.",
"summary": "Cerebrum: Agent SDK for AIOS",
"version": "0.0.2",
"project_urls": {
"Homepage": "https://github.com/agiresearch/Cerebrum",
"Repository": "https://github.com/agiresearch/Cerebrum.git"
},
"split_keywords": [
"agi",
" aios",
" cerebrum",
" llm",
" os"
],
"urls": [
{
"comment_text": "",
"digests": {
"blake2b_256": "b9be1a128f830f20f0a19887467eb4d73ecc8842accceda4b4016399d8b9acb8",
"md5": "4306b84235d92a136eb747b4f6759e6d",
"sha256": "9067a16af4c5617db9dfee7c096aacc69218e20d42ce42239861ca873173d858"
},
"downloads": -1,
"filename": "aios_agent_sdk-0.0.2-py3-none-any.whl",
"has_sig": false,
"md5_digest": "4306b84235d92a136eb747b4f6759e6d",
"packagetype": "bdist_wheel",
"python_version": "py3",
"requires_python": "<3.12,>=3.10",
"size": 59776,
"upload_time": "2024-11-26T19:00:05",
"upload_time_iso_8601": "2024-11-26T19:00:05.472313Z",
"url": "https://files.pythonhosted.org/packages/b9/be/1a128f830f20f0a19887467eb4d73ecc8842accceda4b4016399d8b9acb8/aios_agent_sdk-0.0.2-py3-none-any.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": "",
"digests": {
"blake2b_256": "42b9519ba5a29b1487ab07916e5661277d971d70e28e0d95ee8a83284ad505cc",
"md5": "a1079815cf4b6e6b3ef63f75bbe6905e",
"sha256": "872afc843fda8dfc75596963b02a455ab11fb63cbf14cf07b55c2dee5d62cd97"
},
"downloads": -1,
"filename": "aios_agent_sdk-0.0.2.tar.gz",
"has_sig": false,
"md5_digest": "a1079815cf4b6e6b3ef63f75bbe6905e",
"packagetype": "sdist",
"python_version": "source",
"requires_python": "<3.12,>=3.10",
"size": 30512,
"upload_time": "2024-11-26T19:00:06",
"upload_time_iso_8601": "2024-11-26T19:00:06.693433Z",
"url": "https://files.pythonhosted.org/packages/42/b9/519ba5a29b1487ab07916e5661277d971d70e28e0d95ee8a83284ad505cc/aios_agent_sdk-0.0.2.tar.gz",
"yanked": false,
"yanked_reason": null
}
],
"upload_time": "2024-11-26 19:00:06",
"github": true,
"gitlab": false,
"bitbucket": false,
"codeberg": false,
"github_user": "agiresearch",
"github_project": "Cerebrum",
"travis_ci": false,
"coveralls": false,
"github_actions": false,
"requirements": [
{
"name": "requests",
"specs": []
},
{
"name": "platformdirs",
"specs": []
},
{
"name": "pydantic",
"specs": []
}
],
"lcname": "aios-agent-sdk"
}