ailite


Name: ailite
Version: 6.1.10
Home page: https://github.com/santhosh/
Summary: All popular Framework HF integration kit
Upload time: 2024-12-12 06:47:41
Author: Kammari Santhosh
Requires Python: <4.0,>=3.10
License: MIT
# AiLite 🚀

AiLite is a unified interface for accessing state-of-the-art language models through popular AI frameworks. It provides seamless integration with frameworks like DSPy, LangChain, AutoGen, and LlamaIndex while making advanced AI models accessible and free to use.

[![PyPI version](https://badge.fury.io/py/ailite.svg)](https://badge.fury.io/py/ailite)
[![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](https://opensource.org/licenses/MIT)
[![Python 3.10+](https://img.shields.io/badge/python-3.10+-blue.svg)](https://www.python.org/downloads/release/python-3100/)

## 🌟 Features

- **Universal Framework Support**: Compatible with major AI frameworks including DSPy, LangChain, AutoGen, and LlamaIndex
- **Access to Leading Models**: Support for 30+ cutting-edge language models from providers like:
  - Qwen (72B, 32B models)
  - Meta's Llama family
  - Google's Gemma series
  - Mistral and Mixtral
  - Microsoft's Phi models
  - And many more!
- **Framework-Native Integration**: Use models with your favorite framework's native interfaces
- **Consistent API**: Uniform experience across different frameworks
- **Free Access**: Leverage powerful AI models without cost barriers

## 📦 Installation

```bash
pip install ailite
```

## 🚀 Quick Start

### DSPy Integration

```python
from ailite.dspy import HFLM

model = HFLM(model="Qwen/Qwen2.5-72B-Instruct")
```
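
Since `HFLM` is meant to slot into DSPy's language-model interface, you can register it as the default LM and drive it through a signature-based module. A minimal sketch, assuming `HFLM` behaves like a standard DSPy LM (the exact configuration call may vary with your DSPy version):

```python
import dspy
from ailite.dspy import HFLM

# Register the AiLite-backed model as DSPy's default language model.
model = HFLM(model="Qwen/Qwen2.5-72B-Instruct")
dspy.settings.configure(lm=model)

# Define a simple question-answering predictor and run it.
qa = dspy.Predict("question -> answer")
result = qa(question="What is the capital of France?")
print(result.answer)
```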

### LangChain Integration

```python
from ailite.langchain import ChatOpenAI

chat_model = ChatOpenAI(model="mistralai/Mixtral-8x7B-Instruct-v0.1")
```
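
Because the class mirrors LangChain's `ChatOpenAI`, the usual chat-model interface should apply. A minimal sketch, assuming the wrapper accepts standard LangChain inputs:

```python
from ailite.langchain import ChatOpenAI

chat_model = ChatOpenAI(model="mistralai/Mixtral-8x7B-Instruct-v0.1")

# invoke() accepts a plain string (or a list of messages) and returns an
# AIMessage; the generated text is available on .content.
response = chat_model.invoke("Summarize the benefits of unit testing in two sentences.")
print(response.content)
```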

### AutoGen Integration

```python
from ailite.autogen import OpenAIChatCompletionClient

client = OpenAIChatCompletionClient(model="meta-llama/Llama-3.1-70B-Instruct")
```
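
The client name matches AutoGen's `OpenAIChatCompletionClient`, so it should plug into AgentChat agents in the same way. A minimal sketch, assuming compatibility with the AutoGen 0.4-style `AssistantAgent` API (the agent wiring below is an assumption, not part of AiLite itself):

```python
import asyncio

from ailite.autogen import OpenAIChatCompletionClient
from autogen_agentchat.agents import AssistantAgent

client = OpenAIChatCompletionClient(model="meta-llama/Llama-3.1-70B-Instruct")

# Wrap the AiLite client in an AssistantAgent and run a single task.
agent = AssistantAgent(name="assistant", model_client=client)

async def main() -> None:
    result = await agent.run(task="Write a haiku about distributed systems.")
    print(result.messages[-1].content)

asyncio.run(main())
```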

### LlamaIndex Integration

```python
from ailite.llamaindex import OpenAI

llm = OpenAI(model="google/gemma-2-9b-it")
```
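
Assuming the wrapper keeps LlamaIndex's standard LLM interface, you can call it directly or hand it to an index or query engine. A minimal sketch of a one-shot completion:

```python
from ailite.llamaindex import OpenAI

llm = OpenAI(model="google/gemma-2-9b-it")

# complete() handles one-shot prompts; chat() takes a list of ChatMessage
# objects. complete() returns a CompletionResponse with a .text field.
response = llm.complete("Explain retrieval-augmented generation in one paragraph.")
print(response.text)
```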

### DeepEval Integration

```python
from ailite.evallite import DeepEvalLLM

from deepeval.metrics import AnswerRelevancyMetric
from deepeval.test_case import LLMTestCase

answer_relevancy_metric = AnswerRelevancyMetric(
    threshold=0.7,
    model=DeepEvalLLM("NousResearch/Hermes-3-Llama-3.1-8B"),
)
test_case = LLMTestCase(
    input="What if these shoes don't fit?",
    # Replace this with the actual output from your LLM application
    actual_output="We offer a 30-day full refund at no extra costs.",
    retrieval_context=["All customers are eligible for a 30 day full refund at no extra costs."]
)

answer_relevancy_metric.measure(test_case)
```
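
Once `measure` finishes, DeepEval stores the result on the metric itself, so you can read it back directly:

```python
# Inspect the metric after measure() has run.
print(answer_relevancy_metric.score)   # float between 0 and 1
print(answer_relevancy_metric.reason)  # model-generated justification
```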

## 🎯 Supported Models

AiLite supports a wide range of cutting-edge language models, including:

### Large Language Models
- Qwen (72B, 32B variants)
- Meta Llama 3 family
- Google Gemma series
- Mistral and Mixtral
- Microsoft Phi
- Yi models
- CodeLlama
- Falcon
- And many more!

For a complete list of supported models, check our [models documentation](docs/MODELS.md).

## 🛠️ Framework Support

Currently supported frameworks:
- DSPy
- LangChain
- AutoGen
- LlamaIndex
- DeepEval

More frameworks coming soon!

## 📚 Documentation

For detailed documentation and examples, visit our [documentation site](docs/README.md).

### Examples

- [Basic Usage Examples](examples/basic_usage.md)
- [Framework-Specific Examples](examples/frameworks.md)
- [Advanced Usage Patterns](examples/advanced.md)

## 🤝 Contributing

We welcome contributions! Please see our [Contributing Guidelines](CONTRIBUTING.md) for details.

1. Fork the repository
2. Create your feature branch (`git checkout -b feature/amazing-feature`)
3. Commit your changes (`git commit -m 'Add amazing feature'`)
4. Push to the branch (`git push origin feature/amazing-feature`)
5. Open a Pull Request

## 📝 License

This project is licensed under the MIT License - see the [LICENSE](LICENSE) file for details.

## 🙏 Acknowledgements

Special thanks to the communities behind:
- DSPy
- LangChain
- AutoGen
- LlamaIndex

And to all the model providers for making their models accessible.

## 📫 Contact

- GitHub Issues: [Create an issue](https://github.com/yourusername/ailite/issues)
- Email: your.email@example.com

## ⭐ Star History

[![Star History Chart](https://api.star-history.com/svg?repos=yourusername/ailite&type=Date)](https://star-history.com/#yourusername/ailite&Date)

---

Made with ❤️
            
