aikitx

- **Name**: aikitx
- **Version**: 1.0.0
- **Summary**: A comprehensive GUI toolkit for Large Language Models (LLMs) with GGUF support, document processing, email automation, and multi-backend inference
- **Upload time**: 2025-07-25 19:44:31
- **Requires Python**: >=3.8
- **Keywords**: llm, ai, machine-learning, gguf, transformers, chatbot, nlp, gui, pyside6, llama-cpp, ctransformers, huggingface, document-processing, email-automation, pdf-processing, summarization, inference, neural-networks, deep-learning, language-models, text-generation, conversation-ai
- **Requirements**: PySide6, pytest, pytest-qt, pytest-cov, pytest-xvfb, pytest-timeout, pytest-mock, pytest-benchmark, pytest-html, pytest-json-report, pytest-parallel, pyyaml, jsonschema, appdirs, networkx, psutil, llama-cpp-python, ctransformers, pynvml, PyPDF2, pdfplumber, python-docx, google-auth, google-auth-oauthlib, google-api-python-client, black, isort, flake8, mypy
# LLM Toolkit

[![PyPI version](https://badge.fury.io/py/llmtoolkit.svg)](https://badge.fury.io/py/llmtoolkit)
[![Python Support](https://img.shields.io/pypi/pyversions/llmtoolkit.svg)](https://pypi.org/project/llmtoolkit/)
[![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](https://opensource.org/licenses/MIT)

A comprehensive toolkit for working with Large Language Models (LLMs) that provides an intuitive graphical interface for model loading, chat interactions, document summarization, and email automation. Built with modern Python technologies and designed for both developers and end-users.

## Features

### 🤖 Multiple Model Backends
- **GGUF Support**: Optimized inference with ctransformers and llama-cpp-python
- **Hugging Face Integration**: Direct model loading from HF Hub (optional)
- **Hardware Detection**: Automatic GPU/CPU optimization
- **Memory Management**: Intelligent resource allocation

### 💬 Advanced Chat Interface
- **Interactive Conversations**: Real-time chat with loaded models
- **History Management**: Persistent conversation storage
- **Parameter Control**: Fine-tune generation settings
- **Context Awareness**: Maintain conversation context

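The persistent history and context handling described above can be illustrated with a small JSON-backed store. This is a generic sketch, not the toolkit's actual API; the `ChatHistory` class and file layout here are hypothetical:

```python
import json
from pathlib import Path

class ChatHistory:
    """Minimal JSON-backed conversation store (illustrative sketch only)."""

    def __init__(self, path: str = "chat_history.json"):
        self.path = Path(path)
        # Reload any previously persisted conversation on startup.
        self.messages = json.loads(self.path.read_text()) if self.path.exists() else []

    def add(self, role: str, content: str) -> None:
        """Append a turn and persist the whole conversation to disk."""
        self.messages.append({"role": role, "content": content})
        self.path.write_text(json.dumps(self.messages, indent=2))

    def context(self, last_n: int = 10) -> list:
        """Return the most recent turns, used to maintain conversation context."""
        return self.messages[-last_n:]
```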
### 📄 Document Processing
- **Multi-format Support**: PDF, Word, and text documents
- **Intelligent Summarization**: AI-powered content extraction
- **Chunked Processing**: Handle large documents efficiently
- **Batch Operations**: Process multiple files simultaneously
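The chunked-processing idea above can be sketched as an overlapping word-window splitter, so each piece fits a model's context window while overlap preserves continuity between chunks. This is a generic illustration, not the toolkit's internal implementation:

```python
def chunk_text(text: str, chunk_size: int = 500, overlap: int = 50) -> list[str]:
    """Split text into overlapping chunks of at most chunk_size words."""
    words = text.split()
    step = chunk_size - overlap  # advance leaves `overlap` words shared between chunks
    chunks = []
    for start in range(0, len(words), step):
        chunks.append(" ".join(words[start:start + chunk_size]))
        if start + chunk_size >= len(words):
            break  # the final chunk already reaches the end of the document
    return chunks
```

Each chunk would then be summarized independently and the partial summaries combined.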

### 📧 Email Automation
- **Gmail Integration**: Secure OAuth2 authentication
- **AI-Powered Drafting**: Generate professional emails
- **Smart Replies**: Context-aware response generation
- **Bulk Operations**: Marketing and communication automation

### 🎨 Modern User Interface
- **Cross-Platform**: Windows, macOS, and Linux support
- **Theme Support**: Dark and light mode options
- **Responsive Design**: Adaptive layout for different screen sizes
- **Accessibility**: Keyboard shortcuts and screen reader support

### ⚡ Performance & Reliability
- **Multi-threading**: Non-blocking UI operations
- **Resource Monitoring**: Real-time memory and CPU tracking
- **Error Recovery**: Graceful handling of failures
- **Logging System**: Comprehensive debugging information

## Quick Start

1. **Install the package:**
   ```bash
   pip install llmtoolkit
   ```

2. **Launch the application:**
   ```bash
   llmtoolkit
   ```

3. **Load a model and start chatting!**

## Installation

### Basic Installation

```bash
pip install llmtoolkit
```

### With Optional Dependencies

For Hugging Face transformers support:
```bash
pip install llmtoolkit[transformers]
```

For GPU acceleration:
```bash
pip install llmtoolkit[gpu]
```

For all features:
```bash
pip install llmtoolkit[all]
```

## Usage

### Command Line

After installation, you can launch the application with:

```bash
llmtoolkit
```

### Command Line Options

```bash
llmtoolkit --help          # Show help message
llmtoolkit --version       # Show version information
llmtoolkit --model PATH    # Load a specific model on startup
llmtoolkit --debug         # Enable debug logging
```

### Python Module

You can also run it as a Python module:

```bash
python -m llmtoolkit
```

### Programmatic Usage

```python
import llmtoolkit

# Launch the GUI application
llmtoolkit.main()

# Or access specific components
from llmtoolkit.app.core import ModelService
model_service = ModelService()
```

## Supported Model Formats

- **GGUF** (.gguf) - Recommended format for efficient inference
- **GGML** (.ggml) - Legacy format support
- **Hugging Face** - Direct model loading from HF Hub (with transformers extra)
- **PyTorch** (.bin, .pt, .pth) - PyTorch model files
- **Safetensors** (.safetensors) - Secure tensor serialization format
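Which backend applies is typically decided by file extension. A minimal dispatch sketch, with the mapping taken from the list above (the function name is illustrative, not part of the package's API):

```python
from pathlib import Path

# Extension → format mapping, from the supported-format list above.
MODEL_FORMATS = {
    ".gguf": "GGUF",
    ".ggml": "GGML",
    ".bin": "PyTorch",
    ".pt": "PyTorch",
    ".pth": "PyTorch",
    ".safetensors": "Safetensors",
}

def detect_model_format(path: str) -> str:
    """Return the model format implied by a file's extension."""
    suffix = Path(path).suffix.lower()
    try:
        return MODEL_FORMATS[suffix]
    except KeyError:
        raise ValueError(f"Unsupported model format: {suffix or path!r}")
```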

## System Requirements

- **Python**: 3.8 or higher
- **Operating System**: Windows, macOS, or Linux
- **Memory**: 4GB RAM minimum (8GB+ recommended for larger models)
- **Storage**: 2GB free space (plus space for models)
- **GPU** (optional): NVIDIA CUDA, AMD ROCm, or Apple Metal support
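As a rough sizing guide for the memory figures above: a quantized GGUF model's weights occupy close to the file's size in RAM, plus runtime overhead (KV cache, buffers). The following rule of thumb is an assumption for planning, not a precise measurement:

```python
def estimated_ram_gb(model_file_gb: float, overhead_gb: float = 1.0) -> float:
    """Rough RAM estimate for loading a GGUF model: weights map in at roughly
    file size, plus overhead for the KV cache and runtime (rule of thumb only)."""
    return model_file_gb + overhead_gb
```

For example, a ~4 GB 4-bit 7B model lands around 5 GB, which is why 8 GB+ is recommended for comfortable headroom.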

## Configuration

The application stores configuration and data in:
- **Windows**: `%APPDATA%\llmtoolkit\`
- **macOS**: `~/Library/Application Support/llmtoolkit/`
- **Linux**: `~/.config/llmtoolkit/`
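These paths follow each platform's standard conventions (the package lists `appdirs` among its dependencies for exactly this). A stdlib-only sketch of how such a directory can be resolved:

```python
import os
import sys
from pathlib import Path

def config_dir(app_name: str = "llmtoolkit") -> Path:
    """Resolve the platform-conventional configuration directory (sketch)."""
    if sys.platform == "win32":
        # %APPDATA% normally points at ...\AppData\Roaming
        base = Path(os.environ.get("APPDATA", Path.home() / "AppData" / "Roaming"))
    elif sys.platform == "darwin":
        base = Path.home() / "Library" / "Application Support"
    else:
        # Linux and other Unix follow the XDG base-directory convention.
        base = Path(os.environ.get("XDG_CONFIG_HOME", str(Path.home() / ".config")))
    return base / app_name
```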

## Troubleshooting

### Common Issues

**Installation Problems:**
- Ensure you have Python 3.8+ installed
- Try upgrading pip: `pip install --upgrade pip`
- For GPU support issues, check your CUDA/ROCm installation

**Model Loading Issues:**
- Verify model file format is supported (GGUF recommended)
- Check available system memory
- Ensure model file is not corrupted

**GUI Not Starting:**
- Install GUI dependencies: `pip install llmtoolkit[all]`
- On Linux, ensure X11 forwarding is enabled if using SSH
- Check system compatibility with PySide6

**Performance Issues:**
- Close other memory-intensive applications
- Use smaller models for limited hardware
- Enable GPU acceleration if available

## Development

### Setting up Development Environment

```bash
git clone https://github.com/hussainnazary2/LLM-Toolkit.git
cd LLM-Toolkit
pip install -e .[dev]
```

### Running Tests

```bash
pytest
```

### Code Formatting

```bash
black llmtoolkit/
isort llmtoolkit/
```

## Contributing

Contributions are welcome! Please feel free to submit a Pull Request. For major changes, please open an issue first to discuss what you would like to change.

## License

This project is licensed under the MIT License - see the [LICENSE](LICENSE) file for details.

## Acknowledgments

- Built with [PySide6](https://doc.qt.io/qtforpython/) for the GUI framework
- Model loading powered by [llama-cpp-python](https://github.com/abetlen/llama-cpp-python) and [ctransformers](https://github.com/marella/ctransformers)
- Optional Hugging Face integration via [transformers](https://github.com/huggingface/transformers)

## Changelog

See the [GitHub releases page](https://github.com/hussainnazary2/LLM-Toolkit/releases) for version history and updates.

## Support

If you encounter any issues or have questions:

1. Check the [documentation](https://github.com/hussainnazary2/LLM-Toolkit#readme)
2. Search [existing issues](https://github.com/hussainnazary2/LLM-Toolkit/issues)
3. Create a [new issue](https://github.com/hussainnazary2/LLM-Toolkit/issues/new) if needed
4. Contact the developer: [hussainnazary475@gmail.com](mailto:hussainnazary475@gmail.com)

## Author

**Hussain Nazary**
- Email: [hussainnazary475@gmail.com](mailto:hussainnazary475@gmail.com)
- GitHub: [@hussainnazary2](https://github.com/hussainnazary2)
- Project: [LLM-Toolkit](https://github.com/hussainnazary2/LLM-Toolkit)

---

**Made with ❤️ for the AI community**

            
