swarm-models

Name: swarm-models
Version: 0.3.0
Homepage: https://github.com/The-Swarm-Corporation/swarm-models
Summary: Swarm Models - TGSC
Upload time: 2024-12-30 05:29:33
Author: Kye Gomez
Requires Python: <4.0,>=3.10
License: MIT
Keywords: artificial intelligence, deep learning, optimizers, prompt engineering, natural language processing, machine learning, pytorch, transformers
Requirements: torch, transformers, diffusers, loguru, pydantic, langchain-community, together, litellm, ollama

# Swarm Models

[![Join our Discord](https://img.shields.io/badge/Discord-Join%20our%20server-5865F2?style=for-the-badge&logo=discord&logoColor=white)](https://discord.gg/agora-999382051935506503) [![Subscribe on YouTube](https://img.shields.io/badge/YouTube-Subscribe-red?style=for-the-badge&logo=youtube&logoColor=white)](https://www.youtube.com/@kyegomez3242) [![Connect on LinkedIn](https://img.shields.io/badge/LinkedIn-Connect-blue?style=for-the-badge&logo=linkedin&logoColor=white)](https://www.linkedin.com/in/kye-g-38759a207/) [![Follow on X.com](https://img.shields.io/badge/X.com-Follow-1DA1F2?style=for-the-badge&logo=x&logoColor=white)](https://x.com/kyegomezb)

Swarm Models provides a unified, secure, and highly scalable interface for interacting with multiple LLM and multi-modal APIs across different providers. It is built to streamline your API integrations, ensuring production-grade reliability and robust performance.

## **Key Features**

- **Multi-Provider Support**: Integrate seamlessly with APIs from OpenAI, Anthropic, Azure, and more.
  
- **Enterprise-Grade Security**: Built-in security protocols to protect your API keys and sensitive data, ensuring compliance with industry standards.

- **Lightning-Fast Performance**: Optimized for low latency and high throughput, Swarm Models delivers fast API responses suitable for real-time applications.

- **Ease of Use**: Simplified API interaction with intuitive `.run(task)` and `__call__` methods, making integration effortless.

- **Scalability for All Use Cases**: Whether it's a small script or a massive enterprise-scale application, Swarm Models scales effortlessly.

- **Production-Grade Reliability**: Tested and proven in enterprise environments, ensuring consistent uptime and failover capabilities.

---


## **Onboarding**

Swarm Models simplifies the way you interact with different APIs by providing a unified interface for all models.

### **1. Install Swarm Models**

```bash
pip3 install -U swarm-models
```

### **2. Set Your Keys**

```bash
export OPENAI_API_KEY="your_openai_api_key"
export GROQ_API_KEY="your_groq_api_key"
export ANTHROPIC_API_KEY="your_anthropic_api_key"
export AZURE_OPENAI_API_KEY="your_azure_openai_api_key"
```
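
It helps to fail fast if a key is missing before initializing a model. A minimal sketch (the helper name `require_key` is our own, not part of swarm-models):

```python
import os


def require_key(name: str) -> str:
    """Return the named API key, or raise a clear error if it is unset."""
    value = os.getenv(name)
    if not value:
        raise RuntimeError(
            f"{name} is not set; export it in your shell or load it from a .env file"
        )
    return value


# Assumes OPENAI_API_KEY was exported as shown above
os.environ.setdefault("OPENAI_API_KEY", "your_openai_api_key")
print(require_key("OPENAI_API_KEY"))
```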

### **3. Initialize a Model**

Import the desired model from the package and initialize it with your API key or necessary configuration.

```python
from swarm_models import YourDesiredModel

model = YourDesiredModel(api_key="your_api_key")  # plus any model-specific configuration
```

### **4. Run Your Task**

Use the `.run(task)` method or simply call the model like `model(task)` with your task.

```python
task = "Define your task here"
result = model.run(task)

# Or, equivalently:
# result = model(task)
```
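
The equivalence between `.run(task)` and calling the model directly comes from Python's `__call__` protocol. A toy sketch of the pattern (illustrative only, not the library's actual base class):

```python
class ModelSketch:
    """Illustrative only: shows how calling an instance can delegate to .run()."""

    def run(self, task: str) -> str:
        # A real model would dispatch the task to a provider API here.
        return f"response to: {task}"

    def __call__(self, task: str) -> str:
        # Calling the instance is just sugar for .run(task).
        return self.run(task)


model = ModelSketch()
assert model("hello") == model.run("hello")
```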

### **5. Enjoy the Results**

```python
print(result)
```

---

## **Full Code Example**

```python
from swarm_models import OpenAIChat
import os

# Get the OpenAI API key from the environment variable
api_key = os.getenv("OPENAI_API_KEY")

# Create an instance of the OpenAIChat class
model = OpenAIChat(openai_api_key=api_key, model_name="gpt-4o-mini")

# Query the model with a question
out = model(
   "What is the best state to register a business in the US for the least amount of taxes?"
)

# Print the model's response
print(out)
```

---

## `TogetherLLM` Documentation

The `TogetherLLM` class simplifies interaction with Together's hosted LLMs. It provides a straightforward way to run tasks on these models, including support for concurrent and batch processing.

### Initialization

To use `TogetherLLM`, you need to initialize it with your API key, the name of the model you want to use, and optionally, a system prompt. The system prompt is used to provide context to the model for the tasks you will run.

Here's an example of how to initialize `TogetherLLM`:
```python
import os
from swarm_models import TogetherLLM

model_runner = TogetherLLM(
    api_key=os.environ.get("TOGETHER_API_KEY"),
    model_name="meta-llama/Meta-Llama-3.1-70B-Instruct-Turbo",
    system_prompt="You are Larry Fink",
)
```

### Running Tasks

Once initialized, you can run tasks on the model using the `run` method. This method takes a task string as an argument and returns the response from the model.

Here's an example of running a single task:
```python
task = "How do we allocate capital efficiently, in your opinion, Larry?"
response = model_runner.run(task)
print(response)
```

### Running Multiple Tasks Concurrently

`TogetherLLM` also supports running multiple tasks concurrently using the `run_concurrently` method. This method takes a list of task strings and returns a list of responses from the model.

Here's an example of running multiple tasks concurrently:
```python
tasks = [
    "What are the top-performing mutual funds in the last quarter?",
    "How do I evaluate the risk of a mutual fund?",
    "What are the fees associated with investing in a mutual fund?",
    "Can you recommend a mutual fund for a beginner investor?",
    "How do I diversify my portfolio with mutual funds?",
]
responses = model_runner.run_concurrently(tasks)
for response in responses:
    print(response)
```
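
For intuition, concurrent fan-out over a list of tasks typically looks like the following sketch, using a thread pool and preserving input order (this illustrates the pattern, not the library's internal implementation):

```python
from concurrent.futures import ThreadPoolExecutor


def run_concurrently_sketch(model_run, tasks, max_workers=8):
    """Run model_run over each task in parallel, preserving input order."""
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        # pool.map yields results in the same order as the input tasks
        return list(pool.map(model_run, tasks))


# Demo with a stand-in for a real model's .run method:
fake_run = lambda task: f"answer: {task}"
print(run_concurrently_sketch(fake_run, ["a", "b", "c"]))
# -> ['answer: a', 'answer: b', 'answer: c']
```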


## **Enterprise-Grade Features**

1. **Security**: API keys and user data are handled with the utmost care, using encryption and security best practices to protect your sensitive information.
   
2. **Production Reliability**: Swarm Models has undergone rigorous testing to ensure that it can handle high traffic and remains resilient in enterprise-grade environments.

3. **Fail-Safe Mechanisms**: Built-in failover handling to ensure uninterrupted service even under heavy load or network issues.

4. **Unified API**: No more dealing with multiple SDKs or libraries. Swarm Models standardizes your interactions across providers like OpenAI, Anthropic, Azure, and more, so you can focus on what matters.
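
The failover idea in point 3 can be sketched provider-agnostically: try a primary model and fall back to a secondary when it errors. The helper and stand-in classes below are assumptions for illustration; only the shared `.run(task)` interface comes from the document:

```python
def run_with_failover(task, models):
    """Try each model in order; return the first successful response."""
    last_error = None
    for model in models:
        try:
            return model.run(task)
        except Exception as err:  # in production, catch provider-specific errors
            last_error = err
    raise RuntimeError("all models failed") from last_error


# Demo with stand-in models:
class Flaky:
    def run(self, task):
        raise TimeoutError("provider unavailable")


class Stable:
    def run(self, task):
        return f"ok: {task}"


print(run_with_failover("ping", [Flaky(), Stable()]))  # -> ok: ping
```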

---

## **Available Models**

| Model Name                | Import Path                                           |
|---------------------------|------------------------------------------------------|
| BaseLLM                   | `from swarm_models.base_llm import BaseLLM`         |
| BaseMultiModalModel       | `from swarm_models.base_multimodal_model import BaseMultiModalModel` |
| GPT4VisionAPI             | `from swarm_models.gpt4_vision_api import GPT4VisionAPI` |
| HuggingfaceLLM            | `from swarm_models.huggingface import HuggingfaceLLM` |
| LayoutLMDocumentQA        | `from swarm_models.layoutlm_document_qa import LayoutLMDocumentQA` |
| llama3Hosted              | `from swarm_models.llama3_hosted import llama3Hosted` |
| LavaMultiModal            | `from swarm_models.llava import LavaMultiModal`     |
| Nougat                    | `from swarm_models.nougat import Nougat`            |
| OpenAIEmbeddings          | `from swarm_models.openai_embeddings import OpenAIEmbeddings` |
| OpenAITTS                 | `from swarm_models.openai_tts import OpenAITTS`     |
| GooglePalm                | `from swarm_models.palm import GooglePalm as Palm`  |
| Anthropic                 | `from swarm_models.popular_llms import Anthropic as Anthropic` |
| AzureOpenAI               | `from swarm_models.popular_llms import AzureOpenAILLM as AzureOpenAI` |
| Cohere                    | `from swarm_models.popular_llms import CohereChat as Cohere` |
| OctoAIChat                | `from swarm_models.popular_llms import OctoAIChat`  |
| OpenAIChat                | `from swarm_models.popular_llms import OpenAIChatLLM as OpenAIChat` |
| OpenAILLM                 | `from swarm_models.popular_llms import OpenAILLM as OpenAI` |
| Replicate                 | `from swarm_models.popular_llms import ReplicateChat as Replicate` |
| QwenVLMultiModal          | `from swarm_models.qwen import QwenVLMultiModal`    |
| FireWorksAI               | `from swarm_models.popular_llms import FireWorksAI`  |
| Vilt                      | `from swarm_models.vilt import Vilt`                  |
| TogetherLLM               | `from swarm_models.together_llm import TogetherLLM`  |
| LiteLLM                   | `from swarm_models.lite_llm_model import LiteLLM`    |
| OpenAIFunctionCaller      | `from swarm_models.openai_function_caller import OpenAIFunctionCaller` |
| OllamaModel               | `from swarm_models.ollama_model import OllamaModel`   |
| GroundedSAMTwo            | `from swarm_models.sam_two import GroundedSAMTwo`     |


---

## **Support & Contributions**

- **Documentation**: Comprehensive guides, API references, and best practices are available in our official [Documentation](https://docs.swarms.world).
- **GitHub**: Explore the code, report issues, and contribute to the project via our [GitHub repository](https://github.com/The-Swarm-Corporation/swarm-models).

---

## **License**

Swarm Models is released under the [MIT License](https://github.com/The-Swarm-Corporation/swarm-models/LICENSE).

---


# Todo

- [ ] Add Cohere Command R models
- [ ] Add Gemini and Google AI Studio
- [ ] Integrate Ollama more extensively

            
