llmeasy

Name: llmeasy
Version: 0.1.5
Summary: Easy to use LLM interface
Author: Kevin Wong
License: Apache-2.0
Requires Python: <4.0,>=3.9
Uploaded: 2024-12-09 22:37:41

# AI Chat Framework

A flexible framework for integrating multiple AI chat providers (OpenAI, Claude, Gemini, Mistral, Grok).

## Installation

```bash
pip install llmeasy
```

## Setup

1. Copy `.env.example` to `.env`
2. Add your API keys to `.env`
3. Install dependencies: `poetry install`
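
In a shell, the steps above typically look like this (assuming you are working from a clone of the project repository):

```bash
# 1. Create a local environment file from the template
cp .env.example .env

# 2. Add your API keys to .env with your editor of choice
#    (e.g. ANTHROPIC_API_KEY=..., OPENAI_API_KEY=...)

# 3. Install dependencies with Poetry
poetry install
```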

## Environment Variables

### API Keys
- `ANTHROPIC_API_KEY`
- `OPENAI_API_KEY` 
- `GOOGLE_API_KEY`
- `MISTRAL_API_KEY`
- `GROK_API_KEY`

### Models
- `CLAUDE_MODEL` (default: claude-3-sonnet-20240229)
- `OPENAI_MODEL` (default: gpt-4-turbo-preview)
- `GEMINI_MODEL` (default: gemini-pro)
- `MISTRAL_MODEL` (default: mistral-large-latest)
- `GROK_MODEL` (default: grok-beta)

### Config
- `MAX_TOKENS` (default: 1000)
- `TEMPERATURE` (default: 0.7)
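
A minimal `.env` might look like the following; the variable names come from the lists above, while the placeholder key values are illustrative only:

```bash
# API keys
ANTHROPIC_API_KEY=your-anthropic-key
OPENAI_API_KEY=your-openai-key
GOOGLE_API_KEY=your-google-key
MISTRAL_API_KEY=your-mistral-key
GROK_API_KEY=your-grok-key

# Optional model overrides (defaults shown)
CLAUDE_MODEL=claude-3-sonnet-20240229
OPENAI_MODEL=gpt-4-turbo-preview
GEMINI_MODEL=gemini-pro
MISTRAL_MODEL=mistral-large-latest
GROK_MODEL=grok-beta

# Optional generation settings (defaults shown)
MAX_TOKENS=1000
TEMPERATURE=0.7
```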

## Examples

Run all examples:
```bash
python examples/run_all_examples.py
```

Available examples in `examples/`:
- Basic usage
- Provider-specific implementations
- Advanced patterns
- Custom templates
- Provider comparisons
- Streaming
- Provider chaining
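
The scripts can also be run individually; substitute one of the files in `examples/` for the placeholder below (the exact file names are not listed in this README):

```bash
python examples/<example_script>.py
```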

## Features

- Multi-provider support
- Async/await
- Streaming responses
- Custom templates
- Provider chaining
- Error handling
- Type hints
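
As a rough illustration of how these features might fit together, the snippet below sketches an async, streaming call. The `LLMEasy` class name, constructor argument, and `stream` method are assumptions made for illustration and are not confirmed by this README; consult the scripts in `examples/` for the actual API.

```python
import asyncio

# Hypothetical usage sketch: the class and method names below are
# assumptions, not taken from this README. See examples/ for the real API.
from llmeasy import LLMEasy  # assumed entry point

async def main():
    # Assumed constructor: choose a provider; keys and models come from .env
    llm = LLMEasy(provider="claude")

    # Assumed streaming call: iterate over response chunks as they arrive
    async for chunk in llm.stream("Summarize the Apache-2.0 license in one line"):
        print(chunk, end="", flush=True)

if __name__ == "__main__":
    asyncio.run(main())
```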

## License

Apache License 2.0