lexa

Name: lexa
Version: 1.0.3
Summary: Python SDK for Lexa AI - OpenAI-compatible interface for Lexa's language models
Upload time: 2025-08-31 19:44:42
Home page: None
Author: None
Maintainer: None
Docs URL: None
Requires Python: >=3.8
License: MIT
Keywords: ai, api, chat, completion, lexa, llm, ml
Requirements: none recorded
Travis-CI: none
Coveralls test coverage: none

# Lexa Python SDK

[![PyPI version](https://badge.fury.io/py/lexa.svg)](https://pypi.org/project/lexa/)
[![Python 3.8+](https://img.shields.io/badge/python-3.8+-blue.svg)](https://www.python.org/downloads/)
[![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](https://opensource.org/licenses/MIT)

A Python SDK for Lexa AI that provides an OpenAI-compatible interface for easy integration with Lexa's language models. Built with automatic SSL configuration and zero-setup installation.

## ✨ Features

- 🔗 **OpenAI-Compatible**: Drop-in replacement for OpenAI SDK
- 🚀 **Async Support**: Full async/await support for high-performance applications
- 📦 **Type Safety**: Comprehensive type hints and validation with Pydantic
- 🔄 **Streaming**: Real-time streaming responses for interactive applications
- 🛡️ **Auto SSL**: Automatic SSL certificate handling - works out of the box
- 📊 **Multiple Models**: Support for all Lexa models (lexa-mml, lexa-x1, lexa-rho)
- 🔧 **Flexible Configuration**: Optional SSL and configuration overrides
- ⚡ **High Performance**: Optimized HTTP clients with connection pooling

## 📦 Installation

```bash
pip install lexa
```

## 🚀 Quick Start

```python
from lexa_sdk import Lexa

# Initialize the client with your API key
client = Lexa(api_key="your-api-key")

# Simple chat completion
response = client.chat.completions.create(
    model="lexa-mml",
    messages=[
        {"role": "user", "content": "Hello! Tell me a joke."}
    ],
    temperature=0.7,
    max_tokens=100
)

print(response["choices"][0]["message"]["content"])
```

## 📚 Available Models

| Model | Description | Context Window (tokens) | Max Tokens | Use Case |
|-------|-------------|----------------|------------|----------|
| `lexa-mml` | Multimodal model with vision capabilities | 8,192 | 4,096 | General purpose with image understanding |
| `lexa-x1` | Fast, lightweight text-based model | 4,096 | 2,048 | Quick responses, simple tasks |
| `lexa-rho` | Reasoning model with enhanced capabilities | 16,384 | 8,192 | Complex reasoning, analysis |
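
If you want to choose a model programmatically, a small helper like the sketch below can map a request's needs onto the limits in this table. Both the `pick_model` helper and the `CONTEXT_LIMITS` mapping are illustrative, not part of the SDK; the numbers are copied from the table above.

```python
# Illustrative helper, not part of the SDK: pick a model from the table above
# based on a rough prompt-size estimate and whether image input is needed.
CONTEXT_LIMITS = {
    "lexa-x1": 4_096,    # fast, lightweight text model
    "lexa-mml": 8_192,   # multimodal, general purpose
    "lexa-rho": 16_384,  # reasoning model, largest context
}

def pick_model(prompt_tokens: int, needs_vision: bool = False) -> str:
    """Return the smallest model whose context window fits the prompt."""
    if needs_vision:
        return "lexa-mml"  # only lexa-mml has vision capabilities
    for name, limit in sorted(CONTEXT_LIMITS.items(), key=lambda kv: kv[1]):
        if prompt_tokens < limit:
            return name
    return "lexa-rho"  # fall back to the largest context window

print(pick_model(3_000))          # lexa-x1
print(pick_model(10_000))         # lexa-rho
print(pick_model(500, True))      # lexa-mml
```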

## 🔧 Advanced Usage

### Async Support

```python
import asyncio
from lexa_sdk import Lexa

async def main():
    client = Lexa(api_key="your-api-key")

    # Async chat completion
    response = await client.chat.completions.acreate(
        model="lexa-mml",
        messages=[{"role": "user", "content": "Explain quantum computing"}],
        temperature=0.3
    )

    print(response["choices"][0]["message"]["content"])

asyncio.run(main())
```

### Streaming Responses

```python
from lexa_sdk import Lexa

client = Lexa(api_key="your-api-key")

# Streaming chat completion
stream = client.chat.completions.create(
    model="lexa-mml",
    messages=[{"role": "user", "content": "Write a short story"}],
    temperature=0.8,
    stream=True
)

for chunk in stream:
    if chunk["choices"][0]["delta"].get("content"):
        print(chunk["choices"][0]["delta"]["content"], end="", flush=True)
```
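
If you need streaming inside async code, the sketch below combines the async and streaming patterns shown above. It assumes `acreate()` also accepts `stream=True` and returns an async iterator of chunks; that combination is not documented in this README, so verify it against the official documentation before relying on it.

```python
import asyncio
from lexa_sdk import Lexa

async def main():
    client = Lexa(api_key="your-api-key")

    # Assumption: acreate() supports stream=True and yields chunks
    # asynchronously, mirroring the synchronous streaming example above.
    stream = await client.chat.completions.acreate(
        model="lexa-mml",
        messages=[{"role": "user", "content": "Write a short story"}],
        temperature=0.8,
        stream=True,
    )

    async for chunk in stream:
        if chunk["choices"][0]["delta"].get("content"):
            print(chunk["choices"][0]["delta"]["content"], end="", flush=True)

asyncio.run(main())
```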

### Custom SSL Configuration

```python
from lexa_sdk import Lexa

# For environments with SSL issues (not recommended for production)
client = Lexa(
    api_key="your-api-key",
    verify_ssl=False  # ⚠️  Only use if necessary
)

# Or use enhanced SSL (default behavior)
client = Lexa(
    api_key="your-api-key",
    enhanced_ssl=True  # Automatically download and use correct certificates
)
```

## 🛠️ API Reference

### Client Methods

- `client.chat.completions.create()` - Create chat completion
- `client.chat.completions.acreate()` - Async chat completion
- `client.models.list()` - List available models (usage sketch below)
- `client.models.alist()` - Async list models
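
The model-listing calls are not shown elsewhere in this README; a minimal sketch follows. The shape of the return value (a dict with a `data` list of model entries, OpenAI-style) is an assumption based on the SDK's stated OpenAI compatibility, so adjust it to what the SDK actually returns.

```python
from lexa_sdk import Lexa

client = Lexa(api_key="your-api-key")

# Assumption: the response is an OpenAI-style dict with a "data" list of
# model entries; this README does not document the exact return shape.
models = client.models.list()
for model in models.get("data", []):
    print(model.get("id"))
```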

### Parameters

- `model`: Model to use (required)
- `messages`: List of messages (required)
- `temperature`: Sampling temperature (0.0 to 2.0)
- `max_tokens`: Maximum tokens to generate
- `stream`: Enable streaming responses
- `top_p`: Nucleus sampling parameter
- `frequency_penalty`: Penalize tokens in proportion to how often they have already appeared (reduces verbatim repetition)
- `presence_penalty`: Penalize tokens that have already appeared at all (encourages new topics; a combined example follows below)
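
Putting the parameters above together, here is a minimal sketch of a single call. The values are placeholders chosen for illustration; only `model` and `messages` are required.

```python
from lexa_sdk import Lexa

client = Lexa(api_key="your-api-key")

# Placeholder values for every parameter listed above.
response = client.chat.completions.create(
    model="lexa-rho",       # reasoning model from the table above
    messages=[
        {"role": "system", "content": "You are a concise assistant."},
        {"role": "user", "content": "Summarize the benefits of connection pooling."},
    ],
    temperature=0.2,        # low randomness for analytical answers
    max_tokens=512,         # cap on generated tokens
    top_p=0.9,              # nucleus sampling
    frequency_penalty=0.0,  # no extra penalty for repeated tokens
    presence_penalty=0.0,   # no extra penalty for repeated topics
    stream=False,           # set True for incremental chunks
)

print(response["choices"][0]["message"]["content"])
```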

## 🔒 Security & SSL

The Lexa SDK automatically handles SSL certificate verification:
- **Default**: Uses enhanced SSL with automatic certificate management
- **Fallback**: Gracefully falls back to standard SSL verification
- **Manual Override**: Allows custom SSL configuration when needed

## 📖 Documentation

For complete documentation, examples, and API reference, visit:
- [Official Documentation](https://docs.lexa.chat/)
- [GitHub Repository](https://github.com/Robi-Labs/lexa-python-sdk)
- [Issue Tracker](https://github.com/Robi-Labs/lexa-python-sdk/issues)

## 🤝 Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

## 📄 License

This project is licensed under the MIT License - see the [LICENSE](LICENSE) file for details.

## 🙏 Acknowledgments

- Built with ❤️ by [Robi Labs](https://robiai.com/)
- Compatible with OpenAI API specifications
- Powered by Lexa's advanced AI models

            

Raw data

            {
    "_id": null,
    "home_page": null,
    "name": "lexa",
    "maintainer": null,
    "docs_url": null,
    "requires_python": ">=3.8",
    "maintainer_email": null,
    "keywords": "ai, api, chat, completion, lexa, llm, ml",
    "author": null,
    "author_email": "Robi Labs <lexa@robiai.com>",
    "download_url": "https://files.pythonhosted.org/packages/8a/11/4d047f880fa054b5e7ac58a2cff5072f22cb7f1f35b79d6650b1af75c92a/lexa-1.0.3.tar.gz",
    "platform": null,
    "description": "# Lexa Python SDK\n\n[![PyPI version](https://badge.fury.io/py/lexa.svg)](https://pypi.org/project/lexa/)\n[![Python 3.8+](https://img.shields.io/badge/python-3.8+-blue.svg)](https://www.python.org/downloads/)\n[![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](https://opensource.org/licenses/MIT)\n\nA Python SDK for Lexa AI that provides an OpenAI-compatible interface for easy integration with Lexa's language models. Built with automatic SSL configuration and zero-setup installation.\n\n## \u2728 Features\n\n- \ud83d\udd17 **OpenAI-Compatible**: Drop-in replacement for OpenAI SDK\n- \ud83d\ude80 **Async Support**: Full async/await support for high-performance applications\n- \ud83d\udce6 **Type Safety**: Comprehensive type hints and validation with Pydantic\n- \ud83d\udd04 **Streaming**: Real-time streaming responses for interactive applications\n- \ud83d\udee1\ufe0f **Auto SSL**: Automatic SSL certificate handling - works out of the box\n- \ud83d\udcca **Multiple Models**: Support for all Lexa models (lexa-mml, lexa-x1, lexa-rho)\n- \ud83d\udd27 **Flexible Configuration**: Optional SSL and configuration overrides\n- \u26a1 **High Performance**: Optimized HTTP clients with connection pooling\n\n## \ud83d\udce6 Installation\n\n```bash\npip install lexa\n```\n\n## \ud83d\ude80 Quick Start\n\n```python\nfrom lexa_sdk import Lexa\n\n# Initialize the client with your API key\nclient = Lexa(api_key=\"your-api-key\")\n\n# Simple chat completion\nresponse = client.chat.completions.create(\n    model=\"lexa-mml\",\n    messages=[\n        {\"role\": \"user\", \"content\": \"Hello! Tell me a joke.\"}\n    ],\n    temperature=0.7,\n    max_tokens=100\n)\n\nprint(response[\"choices\"][0][\"message\"][\"content\"])\n```\n\n## \ud83d\udcda Available Models\n\n| Model | Description | Context Window | Max Tokens | Use Case |\n|-------|-------------|----------------|------------|----------|\n| `lexa-mml` | Multimodal model with vision capabilities | 8,192 | 4,096 | General purpose with image understanding |\n| `lexa-x1` | Fast, lightweight text-based model | 4,096 | 2,048 | Quick responses, simple tasks |\n| `lexa-rho` | Reasoning model with enhanced capabilities | 16,384 | 8,192 | Complex reasoning, analysis |\n\n## \ud83d\udd27 Advanced Usage\n\n### Async Support\n\n```python\nimport asyncio\nfrom lexa_sdk import Lexa\n\nasync def main():\n    client = Lexa(api_key=\"your-api-key\")\n\n    # Async chat completion\n    response = await client.chat.completions.acreate(\n        model=\"lexa-mml\",\n        messages=[{\"role\": \"user\", \"content\": \"Explain quantum computing\"}],\n        temperature=0.3\n    )\n\n    print(response[\"choices\"][0][\"message\"][\"content\"])\n\nasyncio.run(main())\n```\n\n### Streaming Responses\n\n```python\nfrom lexa_sdk import Lexa\n\nclient = Lexa(api_key=\"your-api-key\")\n\n# Streaming chat completion\nstream = client.chat.completions.create(\n    model=\"lexa-mml\",\n    messages=[{\"role\": \"user\", \"content\": \"Write a short story\"}],\n    temperature=0.8,\n    stream=True\n)\n\nfor chunk in stream:\n    if chunk[\"choices\"][0][\"delta\"].get(\"content\"):\n        print(chunk[\"choices\"][0][\"delta\"][\"content\"], end=\"\", flush=True)\n```\n\n### Custom SSL Configuration\n\n```python\nfrom lexa_sdk import Lexa\n\n# For environments with SSL issues (not recommended for production)\nclient = Lexa(\n    api_key=\"your-api-key\",\n    verify_ssl=False  # \u26a0\ufe0f  Only use if necessary\n)\n\n# Or use enhanced SSL 
(default behavior)\nclient = Lexa(\n    api_key=\"your-api-key\",\n    enhanced_ssl=True  # Automatically download and use correct certificates\n)\n```\n\n## \ud83d\udee0\ufe0f API Reference\n\n### Client Methods\n\n- `client.chat.completions.create()` - Create chat completion\n- `client.chat.completions.acreate()` - Async chat completion\n- `client.models.list()` - List available models\n- `client.models.alist()` - Async list models\n\n### Parameters\n\n- `model`: Model to use (required)\n- `messages`: List of messages (required)\n- `temperature`: Sampling temperature (0.0 to 2.0)\n- `max_tokens`: Maximum tokens to generate\n- `stream`: Enable streaming responses\n- `top_p`: Nucleus sampling parameter\n- `frequency_penalty`: Frequency penalty\n- `presence_penalty`: Presence penalty\n\n## \ud83d\udd12 Security & SSL\n\nThe Lexa SDK automatically handles SSL certificate verification:\n- **Default**: Uses enhanced SSL with automatic certificate management\n- **Fallback**: Gracefully falls back to standard SSL verification\n- **Manual Override**: Allows custom SSL configuration when needed\n\n## \ud83d\udcd6 Documentation\n\nFor complete documentation, examples, and API reference, visit:\n- [Official Documentation](https://docs.lexa.chat/)\n- [GitHub Repository](https://github.com/Robi-Labs/lexa-python-sdk)\n- [Issue Tracker](https://github.com/Robi-Labs/lexa-python-sdk/issues)\n\n## \ud83e\udd1d Contributing\n\nContributions are welcome! Please feel free to submit a Pull Request.\n\n## \ud83d\udcc4 License\n\nThis project is licensed under the MIT License - see the [LICENSE](LICENSE) file for details.\n\n## \ud83d\ude4f Acknowledgments\n\n- Built with \u2764\ufe0f by [Robi Labs](https://robiai.com/)\n- Compatible with OpenAI API specifications\n- Powered by Lexa's advanced AI models\n",
    "bugtrack_url": null,
    "license": "MIT",
    "summary": "Python SDK for Lexa AI - OpenAI-compatible interface for Lexa's language models",
    "version": "1.0.3",
    "project_urls": {
        "Bug Tracker": "https://github.com/Robi-Labs/lexa-python-sdk/issues",
        "Documentation": "https://docs.lexa.chat/",
        "Homepage": "https://lexa.chat",
        "Repository": "https://github.com/Robi-Labs/lexa-python-sdk"
    },
    "split_keywords": [
        "ai",
        " api",
        " chat",
        " completion",
        " lexa",
        " llm",
        " ml"
    ],
    "urls": [
        {
            "comment_text": null,
            "digests": {
                "blake2b_256": "e80bac78adc3a9d78b202489a14e1c845caa462d6896122ff92b915752c7d0fc",
                "md5": "4003d1f6498f1cbb0ec1c8d55f0fa35d",
                "sha256": "01d89e741eeabd9b6559ba6dac45697e5d8c80ba958983ecad49eb4d51caf38c"
            },
            "downloads": -1,
            "filename": "lexa-1.0.3-py3-none-any.whl",
            "has_sig": false,
            "md5_digest": "4003d1f6498f1cbb0ec1c8d55f0fa35d",
            "packagetype": "bdist_wheel",
            "python_version": "py3",
            "requires_python": ">=3.8",
            "size": 24402,
            "upload_time": "2025-08-31T19:44:41",
            "upload_time_iso_8601": "2025-08-31T19:44:41.303905Z",
            "url": "https://files.pythonhosted.org/packages/e8/0b/ac78adc3a9d78b202489a14e1c845caa462d6896122ff92b915752c7d0fc/lexa-1.0.3-py3-none-any.whl",
            "yanked": false,
            "yanked_reason": null
        },
        {
            "comment_text": null,
            "digests": {
                "blake2b_256": "8a114d047f880fa054b5e7ac58a2cff5072f22cb7f1f35b79d6650b1af75c92a",
                "md5": "0c84577e6e7ea589f17144f3a919da8d",
                "sha256": "56a357e4581c17da8e252adcb7171ee1583e17ef07dc6e442c7a1cb0f0c9a589"
            },
            "downloads": -1,
            "filename": "lexa-1.0.3.tar.gz",
            "has_sig": false,
            "md5_digest": "0c84577e6e7ea589f17144f3a919da8d",
            "packagetype": "sdist",
            "python_version": "source",
            "requires_python": ">=3.8",
            "size": 52878,
            "upload_time": "2025-08-31T19:44:42",
            "upload_time_iso_8601": "2025-08-31T19:44:42.859650Z",
            "url": "https://files.pythonhosted.org/packages/8a/11/4d047f880fa054b5e7ac58a2cff5072f22cb7f1f35b79d6650b1af75c92a/lexa-1.0.3.tar.gz",
            "yanked": false,
            "yanked_reason": null
        }
    ],
    "upload_time": "2025-08-31 19:44:42",
    "github": true,
    "gitlab": false,
    "bitbucket": false,
    "codeberg": false,
    "github_user": "Robi-Labs",
    "github_project": "lexa-python-sdk",
    "github_not_found": true,
    "lcname": "lexa"
}
        