# LLM Agentic Tool Mesh
Welcome to LLM Agentic Tool Mesh, a pioneering initiative by HPE Athonet aimed at democratizing Generative Artificial Intelligence (Gen AI). Our vision is to make Gen AI accessible and beneficial to a broader audience, enabling users from various backgrounds to leverage cutting-edge Gen AI technology effortlessly.
## Understanding the Challenges
Gen AI has the potential to revolutionize businesses, but adopting it comes with challenges:
- **Technical Complexity**: Gen AI tools are powerful but often require both coding and machine learning expertise. This makes it difficult for companies to use these tools effectively without specialized skills.
- **Organizational Challenges**: Simply adding a Gen AI team isn’t enough. The real value comes from using the knowledge of your existing teams, especially those who may not be tech experts. However, if not done right, Gen AI can impact team dynamics. It’s important to find ways to use Gen AI that enhance collaboration and make the most of everyone’s expertise.
## Our Approach
LLM Agentic Tool Mesh empowers users to create tools and web applications using Gen AI with Low or No Coding. This approach addresses the technical challenges by simplifying the integration process. By leveraging the Pareto principle, LLM Agentic Tool Mesh focuses on the 20% of features that cover 80% of user needs. This is achieved by abstracting complex, low-level libraries into easy-to-understand services that are accessible even to non-developers, effectively hiding the underlying complexity.
This simplicity not only helps technical teams but also enables non-technical teams to develop tools related to their domain expertise. The platform then allows for the creation of a "Mesh" of these Gen AI tools, providing orchestration capabilities through an agentic Reasoning Engine based on Large Language Models (LLMs). This orchestration ensures that all tools work together seamlessly, enhancing overall functionality and efficiency across the organization.
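To make the idea concrete, here is a purely illustrative sketch of the pattern: each tool is a small, self-describing unit, and an orchestrator routes requests to it. The `Tool` dataclass, `summarizer_tool`, and `route` names below are hypothetical and are **not** the llmesh API (see the project Wiki for the actual services); a real Reasoning Engine would use an LLM to choose the tool instead of a lookup.

```python
# Illustrative sketch only: these names are hypothetical and not the llmesh API.
# It shows the general pattern of hiding complexity behind small, self-describing
# tools that an orchestrator can route requests to.

from dataclasses import dataclass
from typing import Callable, Dict


@dataclass
class Tool:
    """A Gen AI tool exposed to the mesh: a name, a description, and a callable."""
    name: str
    description: str
    run: Callable[[str], str]


def summarizer_tool() -> Tool:
    # A real tool would call an LLM here; this stub keeps the sketch self-contained.
    return Tool(
        name="summarizer",
        description="Summarize a piece of text",
        run=lambda text: f"summary of: {text[:40]}...",
    )


def route(tools: Dict[str, Tool], request: str, task: str) -> str:
    """Stand-in for the agentic Reasoning Engine: pick a tool and invoke it."""
    return tools[task].run(request)


if __name__ == "__main__":
    mesh = {t.name: t for t in [summarizer_tool()]}
    print(route(mesh, "A long document about 5G core networks ...", "summarizer"))
```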
## Quick Start
We have created a series of tools and examples to demonstrate what you can do with LLM Agentic Tool Mesh. To get started, follow these steps to set up your environment, understand the project structure, and run the tools and web applications provided.
### Folder Structure
The project is organized into the following directories:
- **self_serve_platform**: Contains all self-serve platform services for creating tools and web applications. These services are grouped into:
  - **Chat Services**
  - **RAG (Retrieval-Augmented Generation) Services**
  - **Agent Services**
  - **System Platform Services**
- **examples**: Includes example Gen AI tools and web applications based on LLMs that demonstrate various capabilities:
  - **Tool Examples**: Demonstrate how to call an API, improve text, generate code, retrieve information from documents using RAG, and use a multi-agent system to solve complex tasks.
  - **Web Applications**:
    - A chatbot that orchestrates all these tools.
    - An agentic memory for sharing chat messages among different users.
    - A back panel that allows configuring a tool via a user interface.
- **federated_governance**: Contains a set of governance policies and standards to ensure consistency, ethical adherence, and quality across all tools.
### Prerequisites
Before setting up the LLM Agentic Tool Mesh platform, please ensure the following prerequisites are met:
#### General Requirements
- **Python 3.11**: Ensure Python 3.11 is installed on your machine.
- **API Key**: Set your OpenAI API key by assigning it to the `OPENAI_API_KEY` environment variable (see the example below).
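For example, on Linux or macOS the key can be exported in the shell session that will run the tools (the value shown is a placeholder):

```bash
# Replace the placeholder with your own OpenAI API key
export OPENAI_API_KEY="sk-..."

# Optional: check that the variable is visible to child processes
python3 -c "import os; print('OPENAI_API_KEY set:', bool(os.environ.get('OPENAI_API_KEY')))"
```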
### Installation Options
#### Option 1: Install LLM Agentic Tool Mesh Services Only
If you only need the core LLM Agentic Tool Mesh services without the example applications, you can install them directly via `pip`:
```bash
pip install 'llmesh[all]'
```
After installation, refer to the [Usage Guide](https://github.com/HewlettPackard/llmesh/wiki/Usage#using-library-services) for instructions on using platform services.
#### Option 2: Full Example Setup
To use the complete setup, including examples and demo applications, follow these steps:
1. **Clone the Repository**: Download the LLM Agentic Tool Mesh repository to your local machine.
   ```bash
   git clone https://github.com/HewlettPackard/llmesh.git
   cd llmesh
   ```
2. **Install Dependencies**: All dependencies required by the platform are specified in the `pyproject.toml` file. Use the following commands to install them:
   ```bash
   pip install poetry
   poetry install --all-extras
   ```
3. **Setup for Specific Tools**: Some tools, including **tool_rag**, **tool_agents**, and **tool_analyzer**, require additional setup (e.g., copying specific data files and initializing configurations). For detailed setup instructions, refer to the [Installation Guide](https://github.com/HewlettPackard/llmesh/wiki/Installation).
### Running the UIs
You can run the tools and web applications individually or use the provided script `run_examples.sh` to run them all together. Once everything is started, you can access the chatbot app at [https://127.0.0.1:5001/](https://127.0.0.1:5001/) and the back panel at [https://127.0.0.1:5011/](https://127.0.0.1:5011/). Have fun :) !!!
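For example, assuming the repository root is the current directory and the script has execute permission (otherwise invoke it with `bash`):

```bash
# Start all example tools and web applications together
./run_examples.sh

# Then open the web UIs in a browser:
#   https://127.0.0.1:5001/   (chatbot)
#   https://127.0.0.1:5011/   (back panel)
```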
## References
For more details about installation, usage, and advanced configurations, please visit the [LLM Agentic Tool Mesh project Wiki](https://github.com/HewlettPackard/llmesh/wiki).
## Contact
If you have any questions or need further assistance, feel free to contact me at <antonio.fin@hpe.com>.