gcpds-cv-pykit

- **Name**: gcpds-cv-pykit
- **Version**: 0.1.1
- **Summary**: A comprehensive toolkit for computer vision and segmentation tasks
- **Homepage**: https://github.com/UN-GCPDS/gcpds-cv-pykit
- **Author / Maintainer**: GCPDS Team
- **Requires Python**: >=3.8
- **License**: MIT
- **Keywords**: computer vision, segmentation, deep learning, pytorch, unet, medical imaging, image processing, machine learning, artificial intelligence
- **Uploaded**: 2025-07-11 22:59:54

# GCPDS Computer Vision Python Kit

A comprehensive toolkit for computer vision and segmentation tasks, developed by the GCPDS Team. This package provides state-of-the-art tools for training, evaluating, and deploying segmentation models with support for various architectures, loss functions, and performance metrics.

## 🚀 Features

- **Segmentation Models**: Support for UNet and other popular architectures
- **Multiple Loss Functions**: DICE, Cross Entropy, Focal Loss, and Tversky Loss
- **Performance Evaluation**: Comprehensive metrics including Dice, Jaccard, Sensitivity, and Specificity
- **Training Pipeline**: Complete training workflow with validation and monitoring
- **Experiment Tracking**: Integration with Weights & Biases (wandb)
- **Mixed Precision Training**: Automatic Mixed Precision (AMP) support for faster training
- **Flexible Configuration**: YAML/JSON-based configuration system
- **Visualization Tools**: Built-in visualization utilities for model predictions
- **Memory Management**: Efficient memory handling and cleanup utilities

## 📋 Requirements

- Python >= 3.8
- PyTorch >= 2.0.0
- CUDA-compatible GPU (recommended)
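
If you want to confirm the PyTorch and GPU requirements before installing, a quick check from a Python shell is:

```python
import torch

print("PyTorch:", torch.__version__)          # should be >= 2.0.0
print("CUDA available:", torch.cuda.is_available())
if torch.cuda.is_available():
    print("GPU:", torch.cuda.get_device_name(0))
```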

## 🔧 Installation

### From PyPI
```bash
pip install gcpds-cv-pykit
```

### From Source
```bash
git clone https://github.com/UN-GCPDS/gcpds-cv-pykit.git
cd gcpds-cv-pykit
pip install -e .
```

### Development Installation
```bash
git clone https://github.com/UN-GCPDS/gcpds-cv-pykit.git
cd gcpds-cv-pykit
pip install -e ".[dev,docs]"
```

## 📦 Dependencies

### Core Dependencies
- `torch>=2.0.0` - Deep learning framework
- `torchvision>=0.15.0` - Computer vision utilities
- `numpy>=1.21.0` - Numerical computing
- `opencv-python>=4.6.0` - Image processing
- `matplotlib>=3.5.0` - Plotting and visualization
- `wandb>=0.15.0` - Experiment tracking
- `tqdm>=4.64.0` - Progress bars
- `Pillow>=9.0.0` - Image handling
- `scipy>=1.9.0` - Scientific computing
- `pandas>=1.4.0` - Data manipulation

### Optional Dependencies
- **Development**: `pytest`, `black`, `flake8`, `isort`
- **Documentation**: `sphinx`, `sphinx-rtd-theme`

## 🏗️ Project Structure

```
gcpds_cv_pykit/
├── baseline/
│   ├── trainers/            # Training utilities
│   ├── models/              # Model architectures
│   ├── losses/              # Loss functions
│   ├── dataloaders/         # Data loading utilities
│   └── performance_model.py # Model evaluation
├── crowd/                   # Crowd-specific implementations
├── datasets/                # Dataset utilities
└── visuals/                 # Visualization tools
```

## 🚀 Quick Start

### Basic Training Example

```python
from gcpds_cv_pykit.baseline.trainers import SegmentationModel_Trainer
from torch.utils.data import DataLoader

# Define your configuration
config = {
    'Model': 'UNet',
    'Backbone': 'resnet34',
    'Number of classes': 2,
    'Loss Function': 'DICE',
    'Optimizer': 'Adam',
    'Learning Rate': 0.001,
    'Epochs': 100,
    'Batch Size': 8,
    'Input size': [3, 256, 256],
    'AMP': True,
    'Device': 'cuda'
}

# Initialize trainer
trainer = SegmentationModel_Trainer(
    train_loader=train_dataloader,
    valid_loader=valid_dataloader,
    config=config
)

# Start training
trainer.start()
```
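
The example assumes `train_dataloader` and `valid_dataloader` already exist. The sketch below builds them with a plain PyTorch `Dataset`; the `SegmentationFolder` class and its `images/`/`masks/` folder layout are hypothetical stand-ins for your own data pipeline (the toolkit also ships loading utilities under `gcpds_cv_pykit.baseline.dataloaders`).

```python
import glob
import os

import numpy as np
import torch
from PIL import Image
from torch.utils.data import DataLoader, Dataset


class SegmentationFolder(Dataset):
    """Hypothetical dataset: images/*.png paired with masks/*.png of the same name."""

    def __init__(self, root, size=(256, 256)):
        self.images = sorted(glob.glob(os.path.join(root, "images", "*.png")))
        self.size = size

    def __len__(self):
        return len(self.images)

    def __getitem__(self, idx):
        img_path = self.images[idx]
        mask_path = img_path.replace("images", "masks")
        image = np.array(Image.open(img_path).convert("RGB").resize(self.size)) / 255.0
        mask = np.array(Image.open(mask_path).resize(self.size, Image.NEAREST))
        image = torch.from_numpy(image).permute(2, 0, 1).float()  # (3, H, W)
        mask = torch.from_numpy(mask).long()                      # (H, W) class indices
        return image, mask


train_dataloader = DataLoader(SegmentationFolder("data/train"), batch_size=8, shuffle=True)
valid_dataloader = DataLoader(SegmentationFolder("data/valid"), batch_size=8)
```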

### Model Evaluation Example

```python
from gcpds_cv_pykit.baseline import PerformanceModels

# Evaluate trained model
evaluator = PerformanceModels(
    model=trained_model,
    test_dataset=test_dataloader,
    config=config,
    save_results=True
)
```

### Custom Loss Function

```python
from gcpds_cv_pykit.baseline.losses import DICELoss, FocalLoss

# DICE Loss
dice_loss = DICELoss(smooth=1.0, reduction='mean')

# Focal Loss
focal_loss = FocalLoss(alpha=0.25, gamma=2.0, reduction='mean')
```
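
Assuming these loss objects behave like standard `torch.nn` modules that take `(predictions, targets)` tensors (an assumption, since the exact signature is not documented here), they can be applied to batched model outputs:

```python
import torch

# Hedged example: shapes and one-hot target format are assumptions,
# not confirmed against the package documentation.
predictions = torch.sigmoid(torch.randn(8, 2, 256, 256))   # per-class probabilities
targets = torch.randint(0, 2, (8, 2, 256, 256)).float()    # one-hot style masks

print("DICE:", dice_loss(predictions, targets).item())
print("Focal:", focal_loss(predictions, targets).item())
```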

## 📊 Supported Models

- **UNet**: Classic U-Net architecture with various backbone options
  - Backbones: ResNet, EfficientNet, and more
  - Pretrained weights support
  - Customizable activation functions

## 🎯 Loss Functions

- **DICE Loss**: Optimized for segmentation tasks
- **Cross Entropy**: Standard classification loss
- **Focal Loss**: Addresses class imbalance
- **Tversky Loss**: A generalization of the Dice loss with tunable weighting of false positives and false negatives (see the sketch below)
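
For reference, the Tversky index reduces to the Dice coefficient when its false-positive and false-negative weights are equal. The snippet below is an illustrative definition, not the package's internal implementation:

```python
import torch


def tversky_index(pred, target, alpha=0.5, beta=0.5, smooth=1.0):
    """Soft Tversky index; alpha weights false positives, beta false negatives.

    The corresponding loss is 1 - tversky_index. With alpha = beta = 0.5 this
    equals the soft Dice coefficient.
    """
    tp = (pred * target).sum()
    fp = (pred * (1 - target)).sum()
    fn = ((1 - pred) * target).sum()
    return (tp + smooth) / (tp + alpha * fp + beta * fn + smooth)


pred = torch.rand(1, 2, 256, 256)
target = torch.randint(0, 2, (1, 2, 256, 256)).float()

dice = tversky_index(pred, target)                           # alpha = beta = 0.5 -> Dice
tversky = tversky_index(pred, target, alpha=0.3, beta=0.7)   # penalize false negatives more
```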

## 📈 Metrics

The toolkit provides comprehensive evaluation metrics:

- **Dice Coefficient**: Overlap-based similarity measure
- **Jaccard Index (IoU)**: Intersection over Union
- **Sensitivity (Recall)**: True positive rate
- **Specificity**: True negative rate

All metrics are calculated both globally and per-class.
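
For reference, all four metrics can be derived from per-class confusion counts. The function below is an illustrative computation on integer label maps, not the toolkit's internal code:

```python
import torch


def per_class_metrics(pred, target, num_classes, eps=1e-7):
    """Dice, IoU, sensitivity and specificity from integer label maps of shape (H, W)."""
    results = {}
    for c in range(num_classes):
        p, t = (pred == c), (target == c)
        tp = (p & t).sum().float()
        fp = (p & ~t).sum().float()
        fn = (~p & t).sum().float()
        tn = (~p & ~t).sum().float()
        results[c] = {
            "dice": (2 * tp / (2 * tp + fp + fn + eps)).item(),
            "iou": (tp / (tp + fp + fn + eps)).item(),
            "sensitivity": (tp / (tp + fn + eps)).item(),
            "specificity": (tn / (tn + fp + eps)).item(),
        }
    return results
```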

## 🔧 Configuration

The toolkit uses dictionary-based configuration. Key parameters include:

```python
config = {
    # Model Configuration
    'Model': 'UNet',
    'Backbone': 'resnet34',
    'Pretrained': True,
    'Number of classes': 2,
    'Input size': [3, 256, 256],
    
    # Training Configuration
    'Loss Function': 'DICE',
    'Optimizer': 'Adam',
    'Learning Rate': 0.001,
    'Epochs': 100,
    'Batch Size': 8,
    
    # Advanced Options
    'AMP': True,  # Automatic Mixed Precision
    'Device': 'cuda',
    'Wandb monitoring': ['api_key', 'project_name', 'run_name']
}
```
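
Because the configuration is a plain dictionary, it can be serialized to JSON (or YAML) and reused across runs; a minimal example with the standard library:

```python
import json

# Save the configuration alongside the experiment
with open("config.json", "w") as f:
    json.dump(config, f, indent=2)

# Reload it later for evaluation or to reproduce a run
with open("config.json") as f:
    config = json.load(f)
```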

## 📊 Experiment Tracking

Integration with Weights & Biases for experiment tracking:

```python
config['Wandb monitoring'] = [
    'your_wandb_api_key',
    'your_project_name',
    'experiment_name'
]
```
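
To avoid hard-coding the API key, you can supply it through the standard `WANDB_API_KEY` environment variable or `wandb.login()` before starting the trainer; how the trainer consumes the config entry is determined by the package.

```python
import os

import wandb

# Assumes WANDB_API_KEY has been exported in your shell
wandb.login(key=os.environ["WANDB_API_KEY"])

config['Wandb monitoring'] = [
    os.environ["WANDB_API_KEY"],
    'your_project_name',
    'experiment_name'
]
```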

## 🎨 Visualization

Built-in visualization tools for:
- Training/validation curves
- Model predictions vs ground truth
- Metric evolution across epochs
- Sample predictions
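
The built-in utilities live under `gcpds_cv_pykit.visuals`; their exact API is not shown here. A generic matplotlib overlay of a prediction against its ground truth could look like this:

```python
import matplotlib.pyplot as plt


def show_prediction(image, mask, pred):
    """image: (3, H, W) float tensor in [0, 1]; mask/pred: (H, W) label maps."""
    fig, axes = plt.subplots(1, 3, figsize=(12, 4))
    axes[0].imshow(image.permute(1, 2, 0).cpu().numpy())
    axes[0].set_title("Input")
    axes[1].imshow(mask.cpu().numpy(), cmap="tab20")
    axes[1].set_title("Ground truth")
    axes[2].imshow(pred.cpu().numpy(), cmap="tab20")
    axes[2].set_title("Prediction")
    for ax in axes:
        ax.axis("off")
    plt.show()
```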

## 🧪 Testing

Run the test suite:

```bash
# Run all tests
pytest

# Run with coverage
pytest --cov=gcpds_cv_pykit

# Run specific test file
pytest tests/test_models.py
```

## 📚 Documentation

Build documentation locally:

```bash
cd docs
make html
```

## 🤝 Contributing

We welcome contributions! Please follow these steps:

1. Fork the repository
2. Create a feature branch (`git checkout -b feature/amazing-feature`)
3. Commit your changes (`git commit -m 'Add amazing feature'`)
4. Push to the branch (`git push origin feature/amazing-feature`)
5. Open a Pull Request

### Development Setup

```bash
# Clone the repository
git clone https://github.com/UN-GCPDS/gcpds-cv-pykit.git
cd gcpds-cv-pykit

# Install in development mode
pip install -e ".[dev]"

# Run code formatting
black gcpds_cv_pykit/
isort gcpds_cv_pykit/

# Run linting
flake8 gcpds_cv_pykit/
```

## 📄 License

This project is licensed under the MIT License - see the [LICENSE](LICENSE) file for details.

## 👥 Authors

- **GCPDS Team** - *Initial work* - [gcpds_man@unal.edu.co](mailto:gcpds_man@unal.edu.co)

## 🙏 Acknowledgments

- PyTorch team for the excellent deep learning framework
- The computer vision community for inspiration and best practices
- Contributors and users of this toolkit

## 📞 Support

- **Issues**: [GitHub Issues](https://github.com/UN-GCPDS/gcpds-cv-pykit/issues)
- **Documentation**: [Read the Docs](https://gcpds-cv-pykit.readthedocs.io/)
- **Email**: [gcpds_man@unal.edu.co](mailto:gcpds_man@unal.edu.co)

## 🔄 Changelog

### Version 0.1.0 (Alpha)
- Initial release
- Basic UNet implementation
- Core loss functions
- Training and evaluation pipeline
- Weights & Biases integration

---

**Note**: This project is in active development. APIs may change between versions. Please check the documentation for the latest updates.

            
