| Field | Value |
| --- | --- |
| Name | autogluon.assistant |
| Version | 1.0.0 |
| home_page | None |
| Summary | Multi Agent System for Multimodal ML Automation |
| upload_time | 2025-07-11 19:19:49 |
| maintainer | None |
| docs_url | None |
| author | AutoGluon Community |
| requires_python | <3.12,>=3.8 |
| license | None |
| keywords | |
| VCS | |
| bugtrack_url | |
| requirements | No requirements were recorded. |
| Travis-CI | No Travis. |
| coveralls test coverage | No coveralls. |
<table>
<tr>
<td width="70%">
# AutoGluon Assistant (aka MLZero)
[PyPI](https://pypi.org/project/autogluon.assistant/)
[License](./LICENSE)
[Continuous Integration](https://github.com/autogluon/autogluon-assistant/actions/workflows/continuous_integration.yml)
[Project Page](https://project-mlzero.github.io/)
</td>
<td>
<img src="https://user-images.githubusercontent.com/16392542/77208906-224aa500-6aba-11ea-96bd-e81806074030.png" width="350">
</td>
</tr>
</table>
> **Official implementation** of [MLZero: A Multi-Agent System for End-to-end Machine Learning Automation](https://arxiv.org/abs/2505.13941)
AutoGluon Assistant (aka MLZero) is a multi-agent system that automates end-to-end multimodal machine learning or deep learning workflows by transforming raw multimodal data into high-quality ML solutions with zero human intervention. Leveraging specialized perception agents, dual-memory modules, and iterative code generation, it handles diverse data formats while maintaining high success rates across complex ML tasks.
## 💾 Installation
AutoGluon Assistant supports Python 3.8 - 3.11 and is currently available on Linux only (dependency issues on macOS and Windows will be fixed in our next official release).
You can install it from source (a new version will be released to PyPI soon):
```bash
pip install uv
uv pip install git+https://github.com/autogluon/autogluon-assistant.git
```
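To confirm the installation succeeded, you can check that the CLI entry point is on your `PATH` (a quick sanity check; this assumes the command exposes a standard `--help` flag):
```bash
# Print the CLI usage to verify that the package and its entry point installed correctly.
mlzero --help
```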
## Quick Start
For detailed usage instructions, Anthropic/Azure/OpenAI setup, and advanced configuration options, see our [Getting Started Tutorial](docs/tutorials/getting_started.md).
### 1. API Setup
MLZero uses AWS Bedrock by default. Configure your AWS credentials:
```bash
export AWS_DEFAULT_REGION="<your-region>"
export AWS_ACCESS_KEY_ID="<your-access-key>"
export AWS_SECRET_ACCESS_KEY="<your-secret-key>"
```
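If you have the AWS CLI installed, you can sanity-check that these credentials resolve before running MLZero (optional; the AWS CLI is not required by this package):
```bash
# Prints the account and ARN associated with the exported credentials.
aws sts get-caller-identity
```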
We also support Anthropic, Azure, and OpenAI. Support for more LLM providers (e.g., DeepSeek) will be added soon.
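As an illustration, these providers conventionally read API keys from environment variables such as the ones below; the variable names follow the providers' own SDK conventions rather than anything MLZero-specific, and the exact provider setup is covered in the [Getting Started Tutorial](docs/tutorials/getting_started.md):
```bash
# Standard provider API-key environment variables (set only the one for the provider you use).
export ANTHROPIC_API_KEY="<your-anthropic-key>"
export OPENAI_API_KEY="<your-openai-key>"
export AZURE_OPENAI_API_KEY="<your-azure-openai-key>"
```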
### 2.1 CLI

```bash
mlzero -i <input_data_folder> [-t <optional_user_instructions>]
```
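For example (the folder path and instruction below are placeholders, not files shipped with the package):
```bash
# Run MLZero on a local dataset with an optional natural-language instruction.
mlzero -i ./my_dataset -t "train a classifier and report accuracy on the test split"
```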
### 2.2 Web UI

```bash
mlzero-backend # command to start backend
mlzero-frontend # command to start frontend on port 8509 (default)
```
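Once both commands are running, open the frontend in your browser at `http://localhost:8509` (assuming the default port), then: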
1. **Configure**: Set your model provider and credentials in settings
2. **Upload & Describe**: Drag your data folder into the chat input box, then type what you want to accomplish and press Enter
### 2.3 MCP (Model Context Protocol)
Note: The system can run on a single machine or distributed across multiple machines (e.g., server on EC2, client on your local machine).
1. **Start the server**
```bash
cd autogluon-assistant
mlzero-backend # command to start backend
mlzero-mcp-server # starts the MCP service; run it in a new terminal
```
2. **Start the client**
```bash
cd autogluon-assistant
mlzero-mcp-client
```
Note: You may need to set up port tunneling to expose your local MCP Client Server (port 8005) if you want to use it with remote LLM services (e.g., Claude API, OpenAI API).
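For example, a reverse SSH tunnel through a publicly reachable host is one common way to expose the local port (a generic sketch; `user@your-public-host` is a placeholder, and any other tunneling tool works just as well):
```bash
# Forward port 8005 on the public host back to the local MCP Client Server.
ssh -N -R 8005:localhost:8005 user@your-public-host
```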
### 2.4 Python API
```python
from autogluon.assistant.coding_agent import run_agent

run_agent(
    input_data_folder="<your-input-folder>",   # path to the folder containing your raw data
    output_folder="<your-output-folder>",      # where MLZero writes its outputs
    # more args ...
)
```
## Citation
If you use AutoGluon Assistant (MLZero) in your research, please cite our paper:
```bibtex
@misc{fang2025mlzeromultiagentendtoendmachine,
  title={MLZero: A Multi-Agent System for End-to-end Machine Learning Automation},
  author={Haoyang Fang and Boran Han and Nick Erickson and Xiyuan Zhang and Su Zhou and Anirudh Dagar and Jiani Zhang and Ali Caner Turkmen and Cuixiong Hu and Huzefa Rangwala and Ying Nian Wu and Bernie Wang and George Karypis},
  year={2025},
  eprint={2505.13941},
  archivePrefix={arXiv},
  primaryClass={cs.MA},
  url={https://arxiv.org/abs/2505.13941},
}
```
Raw data
{
    "_id": null,
    "home_page": null,
    "name": "autogluon.assistant",
    "maintainer": null,
    "docs_url": null,
    "requires_python": "<3.12,>=3.8",
    "maintainer_email": null,
    "keywords": null,
    "author": "AutoGluon Community",
    "author_email": null,
    "download_url": "https://files.pythonhosted.org/packages/c9/98/e3fe520b455e1de54742a39d70957e8660ad381052402ed91782ad2bcf6b/autogluon_assistant-1.0.0.tar.gz",
    "platform": null,
"description": "<table>\n<tr>\n<td width=\"70%\">\n\n# AutoGluon Assistant (aka MLZero)\n[](https://pypi.org/project/autogluon.assistant/)\n[](./LICENSE)\n[](https://github.com/autogluon/autogluon-assistant/actions/workflows/continuous_integration.yml)\n[](https://project-mlzero.github.io/)\n\n</td>\n<td>\n<img src=\"https://user-images.githubusercontent.com/16392542/77208906-224aa500-6aba-11ea-96bd-e81806074030.png\" width=\"350\">\n</td>\n</tr>\n</table>\n\n> **Official implementation** of [MLZero: A Multi-Agent System for End-to-end Machine Learning Automation](https://arxiv.org/abs/2505.13941)\n\nAutoGluon Assistant (aka MLZero) is a multi-agent system that automates end-to-end multimodal machine learning or deep learning workflows by transforming raw multimodal data into high-quality ML solutions with zero human intervention. Leveraging specialized perception agents, dual-memory modules, and iterative code generation, it handles diverse data formats while maintaining high success rates across complex ML tasks.\n\n## \ud83d\udcbe Installation\n\nAutoGluon Assistant is supported on Python 3.8 - 3.11 and is available on Linux (will fix dependency issues for MacOS and Windows by our next official release).\n\nYou can install from source (new version will be released to PyPI soon):\n\n```bash\npip install uv\nuv pip install git+https://github.com/autogluon/autogluon-assistant.git\n```\n\n## Quick Start\n\nFor detailed usage instructions, Anthropic/Azure/OpenAI setup, and advanced configuration options, see our [Getting Started Tutorial](docs/tutorials/getting_started.md).\n\n### 1. API Setup\nMLZero uses AWS Bedrock by default. Configure your AWS credentials:\n\n```bash\nexport AWS_DEFAULT_REGION=\"<your-region>\"\nexport AWS_ACCESS_KEY_ID=\"<your-access-key>\"\nexport AWS_SECRET_ACCESS_KEY=\"<your-secret-key>\"\n```\n\nWe also support Anthropic, Azure, and OpenAI. Support for more LLM providers (e.g. DeepSeek, etc.) will be added soon.\n\n### 2.1 CLI\n\n\n\n```bash\nmlzero -i <input_data_folder> [-t <optional_user_instructions>]\n```\n\n### 2.2 Web UI\n\n\n\n```bash\nmlzero-backend # command to start backend\nmlzero-frontend # command to start frontend on 8509(default)\n```\n\n1. **Configure**: Set your model provider and credentials in settings\n2. **Upload & Describe**: Drag your data folder into the chat input box, then type what you want to accomplish and press Enter\n\n### 2.3 MCP (Model Context Protocol)\n\nNote: The system can run on a single machine or distributed across multiple machines (e.g., server on EC2, client on local).\n1. **Start the server**\n```bash\ncd autogluon-assistant\nmlzero-backend # command to start backend\nmlzero-mcp-server # This will start the service\u2014run it in a new terminal.\n```\n2. 
**Start the client**\n```bash\ncd autogluon-assistant\nmlzero-mcp-client\n```\nNote: You may need to set up port tunneling to expose your local MCP Client Server (port 8005) if you want to use it with remote LLM services (e.g., Claude API, OpenAI API).\n\n### 2.4 Python API\n\n```python\nfrom autogluon.assistant.coding_agent import run_agent\nrun_agent(\n input_data_folder=<your-input-folder>,\n output_folder=<your-output-folder>,\n # more args ...\n)\n```\n\n## Citation\nIf you use Autogluon Assistant (MLZero) in your research, please cite our paper:\n\n```bibtex\n@misc{fang2025mlzeromultiagentendtoendmachine,\n title={MLZero: A Multi-Agent System for End-to-end Machine Learning Automation}, \n author={Haoyang Fang and Boran Han and Nick Erickson and Xiyuan Zhang and Su Zhou and Anirudh Dagar and Jiani Zhang and Ali Caner Turkmen and Cuixiong Hu and Huzefa Rangwala and Ying Nian Wu and Bernie Wang and George Karypis},\n year={2025},\n eprint={2505.13941},\n archivePrefix={arXiv},\n primaryClass={cs.MA},\n url={https://arxiv.org/abs/2505.13941}, \n}\n```\n",
"bugtrack_url": null,
"license": null,
"summary": "Multi Agent System for Multimodal ML Automation",
"version": "1.0.0",
"project_urls": {
"Homepage": "https://github.com/autogluon/autogluon-assistant"
},
"split_keywords": [],
"urls": [
{
"comment_text": null,
"digests": {
"blake2b_256": "846df55044c256392bc9fa98176020afd7565d2e6bb2bd01bdef7d7a3a8e043e",
"md5": "9e120ca94eaf817b25f3b311c4eea7d9",
"sha256": "42b91acd5872d81b8a5071a21a41e9018408e46076f6b0be2db9d88e9a5e491a"
},
"downloads": -1,
"filename": "autogluon_assistant-1.0.0-py3-none-any.whl",
"has_sig": false,
"md5_digest": "9e120ca94eaf817b25f3b311c4eea7d9",
"packagetype": "bdist_wheel",
"python_version": "py3",
"requires_python": "<3.12,>=3.8",
"size": 2198880,
"upload_time": "2025-07-11T19:19:47",
"upload_time_iso_8601": "2025-07-11T19:19:47.373006Z",
"url": "https://files.pythonhosted.org/packages/84/6d/f55044c256392bc9fa98176020afd7565d2e6bb2bd01bdef7d7a3a8e043e/autogluon_assistant-1.0.0-py3-none-any.whl",
"yanked": false,
"yanked_reason": null
},
{
"comment_text": null,
"digests": {
"blake2b_256": "c998e3fe520b455e1de54742a39d70957e8660ad381052402ed91782ad2bcf6b",
"md5": "a18db330a5620a7128e56cc19a8b99b4",
"sha256": "d22bc743409bd3b814d49d7a3c9e8fdbfb6f2278cbf8bf850db37846ff166c4c"
},
"downloads": -1,
"filename": "autogluon_assistant-1.0.0.tar.gz",
"has_sig": false,
"md5_digest": "a18db330a5620a7128e56cc19a8b99b4",
"packagetype": "sdist",
"python_version": "source",
"requires_python": "<3.12,>=3.8",
"size": 2012647,
"upload_time": "2025-07-11T19:19:49",
"upload_time_iso_8601": "2025-07-11T19:19:49.590575Z",
"url": "https://files.pythonhosted.org/packages/c9/98/e3fe520b455e1de54742a39d70957e8660ad381052402ed91782ad2bcf6b/autogluon_assistant-1.0.0.tar.gz",
"yanked": false,
"yanked_reason": null
}
],
"upload_time": "2025-07-11 19:19:49",
"github": true,
"gitlab": false,
"bitbucket": false,
"codeberg": false,
"github_user": "autogluon",
"github_project": "autogluon-assistant",
"travis_ci": false,
"coveralls": false,
"github_actions": true,
"lcname": "autogluon.assistant"
}