azllm

Name: azllm
Version: 0.1.6
Home page: https://hanifsajid.github.io/azllm
Summary: A Python package that provides an easier user interface for multiple LLM providers.
Upload time: 2025-07-31 01:06:46
Author: Hanif Sajid
Requires Python: >=3.11
License: MIT
Keywords: llm, openai, ollama, grok, anthropic, deepseek, local, gemini, parallel, batch, text generation
# azllm: A Unified LLM Interface for Multi-Provider Access

[![PyPI version](https://img.shields.io/pypi/v/azllm)](https://pypi.org/project/azllm/)
[![DOI](https://zenodo.org/badge/972978252.svg)](https://doi.org/10.5281/zenodo.15299641)
[![Python](https://img.shields.io/pypi/pyversions/azllm)](https://www.python.org/)


`azllm` is a Python package that provides a unified interface to work with multiple LLM providers including OpenAI, DeepSeek, Grok, Gemini, Meta's LLaMA, Anthropic, Ollama, and more.

> NOTE: For advanced usage, see the `azllm` <a href="https://hanifsajid.github.io/azllm" target="_blank">documentation</a> and/or <a href="https://github.com/hanifsajid/azllm/tree/main/examples" target="_blank">examples</a>.
---
## Features

- One unified interface for all major LLM APIs
- Batch and parallel prompt generation
- Structured outputs (parsing) with <a href="https://docs.pydantic.dev/latest/" target="_blank">Pydantic</a>, both for models that support parsed outputs natively and for DeepSeek and Anthropic
- Per-model configurations and lazy initialization
- Clean error handling
- `.env`-based API key management
---

## Supported Clients

- <a href="https://platform.openai.com/docs/overview" target="_blank">OpenAI</a>
- <a href="https://api-docs.deepseek.com" target="_blank">DeepSeek</a>
- <a href="https://x.ai" target="_blank">Grok</a>
- <a href="https://www.anthropic.com/claude" target="_blank">Anthropic</a>
- <a href="https://fireworks.ai" target="_blank">Fireworks</a> for Meta's LLaMA and others.
- <a href="https://ai.google.dev/gemini-api/docs" target="_blank">Google's Gemini</a>
- <a href="https://ollama.com" target="_blank">Ollama</a>

**NOTE:** To request support for additional LLMs, please open an issue on our <a href="https://github.com/hanifsajid/azllm/issues" target="_blank">GitHub page</a>.

## Installation

You can install the `azllm` package via pip:

```bash
pip install azllm
```

### Prerequisites

- Python 3.11+
- Create a `.env` file to store your API keys. For example:

    ```bash
    OPENAI_API_KEY=your_openai_api_key
    DEEPSEEK_API_KEY=your_deepseek_api_key
    XAI_API_KEY=your_xai_api_key
    GEMINI_API_KEY=your_gemini_api_key
    ANTHROPIC_API_KEY=your_anthropic_api_key
    FIREWORKS_API_KEY=your_fireworks_api_key
    ```
- <a href="https://ollama.com" target="_blank">Ollama</a> must be installed and running locally to use Ollama models.
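
The keys in the `.env` file are read into environment variables. In practice a package such as `python-dotenv` is typically used for this, but the format is simple enough that a minimal stdlib loader illustrates what happens behind the scenes (this is a sketch, not azllm's actual loading code):

```python
import os

def load_env(path: str = ".env") -> None:
    """Minimal .env loader: copy KEY=VALUE lines into os.environ."""
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            # Skip blank lines, comments, and lines without a key=value pair
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            # setdefault: a key already set in the real environment wins
            os.environ.setdefault(key.strip(), value.strip())
```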

## Quick Start

### Basic Initialization

```python
from azllm import azLLM

manager = azLLM()  # Instantiated with default parameters
```

### Generate Text from a Single Prompt 

```python
prompt = 'What is the capital of France?'
generated_text = manager.generate_text('openai', prompt)
print(generated_text)
```
### Batch Generation

Generate responses for multiple prompts at once:

```python
batch_prompts = [
    'What is the capital of France?',
    'Tell me a joke.',
]

results = manager.batch_generate('openai', batch_prompts)
for result in results:
    print(result)
```
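
`batch_generate` returns one result per prompt, in the same order as the input list. The contract can be sketched with a hypothetical stand-in client (the `echo` callable below is for illustration only; azllm's real clients call provider APIs):

```python
def batch_generate(client, prompts):
    """Return one generation per prompt, preserving input order."""
    return [client(prompt) for prompt in prompts]

# Hypothetical stand-in for a real model client
echo = lambda prompt: prompt.upper()
results = batch_generate(echo, ['What is the capital of France?', 'Tell me a joke.'])
# One result per prompt, in the same order as the input
```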
### Parallel Generation 

Run a single prompt across multiple models simultaneously:

```python
prompt = 'What is the capital of France?'
models = ['openai', 'grok', 'ollama']

results = manager.generate_parallel(prompt, models)
for model, result in results.items():
    print(f"Model: {model}\nResult: {result}\n")
```
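
Because each provider call is network-bound, fanning one prompt out to several clients is a natural fit for a thread pool. The sketch below shows the general pattern with hypothetical stand-in clients; it is an illustration of the idea, not azllm's internal implementation:

```python
from concurrent.futures import ThreadPoolExecutor

def fan_out(prompt, clients):
    """Send one prompt to every client concurrently; results keyed by name."""
    with ThreadPoolExecutor(max_workers=len(clients)) as pool:
        futures = {name: pool.submit(fn, prompt) for name, fn in clients.items()}
        # .result() blocks until each model's response arrives
        return {name: future.result() for name, future in futures.items()}

# Hypothetical stand-ins for real provider clients
clients = {
    'openai': lambda p: f"openai: {p}",
    'grok': lambda p: f"grok: {p}",
}
results = fan_out('What is the capital of France?', clients)
```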

## License

This project is licensed under the MIT License.

## Citation

```bibtex
@misc{azLLM,
  title        = {azllm},
  author       = {Hanif Sajid and Benjamin Radford and Yaoyao Dai and Jason Windett},
  year         = {2025},
  month        = apr,
  version      = {0.1.6},
  howpublished = {https://github.com/hanifsajid/azllm},
  note         = {MIT License},
  abstract     = {azllm is a Python package designed to interface with various large language models (LLMs) from different AI providers. It offers a unified interface for interacting with models from providers like OpenAI, DeepSeek, Grok, Gemini, Meta's Llama, Anthropic, Ollama, and others. The package allows for customizable configurations, batch generation, parallel generation, error handling, and the ability to parse structured responses from different models.}
}
```

            
